
Logging with Elasticsearch, Logstash and Kibana

dknx01
December 01, 2015


How to manage your logs (Apache access and error logs, syslog and auth.log) with the help of Logstash, Elasticsearch and Kibana, with Redis as a buffer.


Transcript

  1. Logstash-Elasticsearch-Kibana: how to manage logs. By dknx01 (dknx01@yahoo.de), December 1, 2015
  2. Why logging • Debugging • Metrics • Monitoring

  5. Old style • Tail: ssh example.org > tail -f /var/log/some.log • Tools for multiple files, like multitail • Run commands synchronously in multiple ssh sessions. But for more than one file/server, or for automatic statistics:
  10. Better style: everything in one place, with the option of later analysis
  11. The ELK stack • Elasticsearch: search server for indexing the data (NoSQL DB) • Logstash: log data processor that transforms and filters the data • Kibana: web UI for data visualisation and analysis (node.js based)
  14. The infrastructure 1. Read the logs and put them into a Redis DB (the shipper) 2. Read from the Redis DB, filter, and put into Elasticsearch (the indexer)
  15. The infrastructure: why 2 steps? • Logs will be read even if Elasticsearch is not active • Monitor Redis to see how many events are queued (e.g. per second) • Check the event format if we have index problems (e.g. a wrong field value or tag)
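Checking the event format is easiest when you remember that each event in the Redis list is one JSON document. A minimal sketch of such an event and a round-trip check; the exact field set is an assumption and depends on the Logstash version and configuration:

```python
import json

# A hypothetical event as the shipper might push it onto the Redis list.
# Field names like "type" and "path" come from the file input configuration.
event = {
    "@timestamp": "2015-12-01T10:00:00.000Z",
    "message": '127.0.0.1 - - [01/Dec/2015:10:00:00 +0000] "GET / HTTP/1.1" 200 612',
    "type": "apache",
    "path": "/var/log/apache2/example-access.log",
}

serialised = json.dumps(event)

# Round-trip to verify the format is intact before hunting index problems.
decoded = json.loads(serialised)
print(decoded["type"])   # apache
```

Pulling a single entry out of the Redis list and decoding it this way quickly reveals a wrong field value or tag.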
  18. Setup Logstash • Install Java (< 1.9) • Download Logstash from https://www.elastic.co/downloads/logstash • Extract the zip file • Run it: bin/logstash -f logstash.conf (see config file below) • Or install the deb package and run it
  23. Setup Redis • Install Redis from your distribution's package repository

  24. Setup Elasticsearch • Install Java (< 1.9) if not done yet • Download Elasticsearch from https://www.elastic.co/downloads/elasticsearch • Extract the zip file • Run it: bin/elasticsearch • Or install the deb package and run it
  29. Setup Kibana • Install Java (< 1.9) if not done yet • Download Kibana from https://www.elastic.co/downloads/kibana • Extract the zip file • Open config/kibana.yml in an editor • Set elasticsearch.url to point at your Elasticsearch instance (e.g. localhost or 127.0.0.1) • Run it: bin/kibana • Open http://yourhost.com:5601
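The kibana.yml change from the last bullet is a one-line edit; a sketch, where the host and port shown are example values:

```yaml
# config/kibana.yml
# Point Kibana at the Elasticsearch instance (example address).
elasticsearch.url: "http://127.0.0.1:9200"
```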
  33. Config Shipper For the shipper we create a config file:

      input {
        file {
          path => "/var/log/apache2/*access*.log"
          start_position => beginning
          type => apache
          sincedb_path => "/opt/.sincedb_apache_access"
        }
      }
      output {
        redis {
          host => "127.0.0.1"
          data_type => "list"
          key => "logstash"
        }
      }
  34. Config Shipper explained
      input {...}: configuration for our input
      file {...}: specifies a file input (all Apache access log files)
      path: path to our log files (glob pattern)
      start_position: start reading the file from the beginning
      type: adds a field "type" with value "apache" to the event
      sincedb_path: path to the internal database that stores the last reading position in the file(s)
      output {...}: configuration for our output
      redis {...}: configuration for the Redis output
      host: Redis host address
      data_type: specifies that the events are stored as a list in Redis
      key: name of our Redis list
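The sincedb mechanism can be pictured as storing a byte offset per file: on restart, reading resumes at the saved position instead of from the beginning. A rough Python sketch of the idea (not Logstash's actual sincedb file format):

```python
import os
import tempfile

# Simulate a log file and a sincedb-style offset store.
log = tempfile.NamedTemporaryFile(mode="w", delete=False, suffix=".log")
log.write("line 1\nline 2\n")
log.close()

# First pass: read everything, remember where we stopped.
with open(log.name) as f:
    f.read()
    offset = f.tell()          # sincedb would persist this offset on disk

# New lines arrive while the reader is down.
with open(log.name, "a") as f:
    f.write("line 3\n")

# Second pass: resume from the stored offset -- only the new line is seen,
# instead of re-reading (and re-shipping) the whole file.
with open(log.name) as f:
    f.seek(offset)
    new_lines = f.read().splitlines()

print(new_lines)               # ['line 3']
os.unlink(log.name)
```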
  35. Config Indexer For the indexer we create a config file:

      input {
        redis {
          host => "127.0.0.1"
          type => "redis-input"
          data_type => "list"
          key => "logstash"
        }
      }
      filter {
        if [path] =~ "access" { ANALYSE APACHE ACCESS }
        else if [path] =~ "error" { ANALYSE APACHE ERROR }
        else if [type] == "syslog" { ANALYSE SYSLOG }
        else if [type] == "auth" { ANALYSE AUTH LOG }
      }
      output {
        elasticsearch { }
      }
  36. Config Indexer explained
      input {...}: configuration for our input
      redis {...}: configuration for the Redis input
      host: Redis host address
      type: adds a field "type" with value "redis-input" to the event
      data_type: specifies that the events are read as a list from Redis
      key: name of our Redis list
      filter {...}: our filters for the different events (syslog, apache error, apache access, auth)
      if [path|type]: separate filter configurations for our events (see later)
      output {...}: configuration for the Elasticsearch output
      elasticsearch {}: default configuration for Elasticsearch (localhost, no further configuration needed)
  37. Config - Indexer Apache Access Filter The Apache access filter:

      mutate {
        replace => { type => "apache_access" }
        remove_tag => [ "_grokparsefailure" ]
        remove_field => [ "tags", "tag", "path" ]
      }
      grok {
        patterns_dir => "/opt/grok_patterns"
        match => { "message" => "%{VHOSTCOMBINEDAPACHELOG}" }
      }
      date {
        match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
      }
      geoip {
        source => "clientip"
      }
      useragent {
        source => "agent"
      }
  38. Config - Indexer Apache Access Filter explained
      mutate {...}: change field values
      replace: replace the value of field "type" with "apache_access"
      remove_tag: list of tags to be removed
      remove_field: list of fields to be removed
      grok {...}: parse text and structure it
      patterns_dir: path to our own pattern files, if we don't use the built-in ones
      match: field and pattern for matching
      date {...}: parse the "timestamp" field
      geoip: analyse the field "clientip" with GeoIP (city, region, IP, etc.)
      useragent: analyse the field "agent" as a browser user agent (OS, major and minor version, browser name, etc.)
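What a combined-log grok pattern does can be approximated with a plain regular expression. The sketch below extracts the same kind of named fields (clientip, timestamp, request) from one combined-format line; the pattern is a deliberate simplification, not the real %{VHOSTCOMBINEDAPACHELOG} pattern:

```python
import re

# Simplified Apache "combined" log format (without the vhost prefix).
COMBINED = re.compile(
    r'(?P<clientip>\S+) \S+ \S+ '
    r'\[(?P<timestamp>[^\]]+)\] '
    r'"(?P<verb>\S+) (?P<request>\S+) \S+" '
    r'(?P<response>\d{3}) (?P<bytes>\d+|-)'
)

line = ('93.184.216.34 - - [01/Dec/2015:10:00:00 +0000] '
        '"GET /index.html HTTP/1.1" 200 612')

# groupdict() yields the same field-name -> value mapping grok produces.
fields = COMBINED.match(line).groupdict()
print(fields["clientip"], fields["request"], fields["response"])
```

The named groups are exactly what the later filters consume: geoip reads the extracted client IP, date parses the extracted timestamp.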
  39. Config - Indexer Apache Error Filter The Apache error filter:

      grok {
        patterns_dir => "/opt/grok_patterns"
        match => { "message" => "%{APACHERERROR}" }
      }
      multiline {
        pattern => "^PHP\ \b(Notice|Warning|Error|Fatal)\b\: "
        source => "errorMessage"
        what => "next"
      }
      multiline {
        pattern => "^PHP[\ ]{3,}\d+\.\ .*"
        source => "errorMessage"
        what => "previous"
      }
      mutate {
        replace => { type => "apache_error" }
        replace => { message => "%{errorMessage}" }
        ...
      }
      geoip {
        source => "clientIp"
      }
      if [request] == "/feed" {
        drop {}
      }
  40. Config - Indexer Apache Error Filter explained
      grok {...}: parse text and structure it
      patterns_dir: path to our pattern files
      match: field and pattern for matching
      multiline {...}: detect whether we have a multi-line message
      pattern: the detection pattern
      source: the field used for detection
      what: how to handle a match ("next"/"previous": combine with the next or previous message)
      mutate {...}: change field values
      replace: replace the value of field "type" with "apache_error" and of "message" with the value of "errorMessage"
      geoip: analyse the field "clientIp" with GeoIP
      request: if the field "request" has the value "/feed", drop the event; we don't need it
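The two multiline blocks implement a simple rule: a line announcing a PHP error belongs together with the stack-frame lines that follow it. A Python sketch of that joining logic, with patterns simplified from the config above (the sample lines are invented for illustration):

```python
import re

# Simplified versions of the two multiline detection patterns above.
STARTS_ERROR = re.compile(r"^PHP (Notice|Warning|Error|Fatal)\b")  # what => "next"
STACK_FRAME  = re.compile(r"^PHP {3,}\d+\.")                       # what => "previous"

def join_multiline(lines):
    """Combine a PHP error header and its stack frames into one event."""
    events = []
    for line in lines:
        if STACK_FRAME.match(line) and events:
            events[-1] += "\n" + line   # attach frame to the previous event
        else:
            events.append(line)         # header or unrelated line: new event
    return events

lines = [
    "PHP Fatal error: Uncaught exception",
    "PHP   1. {main}() /var/www/index.php:0",
    "PHP   2. run() /var/www/app.php:42",
    "some unrelated error line",
]
events = join_multiline(lines)
print(len(events))   # 2
```

The first three lines collapse into one event, which is then what the mutate block copies into the "message" field.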
  41. Config - Indexer Syslog/Auth Filter The syslog/auth filter:

      grok {
        match => { "message" => "%{SYSLOGT}" }
        add_field => [ "received_at", "%{@timestamp}" ]
      }
      syslog_pri { }
  42. Config - Indexer Syslog/Auth Filter explained
      grok {...}: parse text and structure it
      match: field and pattern for matching
      add_field: add an additional field
      syslog_pri {...}: handle syslog priority levels
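The syslog_pri filter splits the numeric PRI value from a syslog line into facility and severity; the encoding is simply PRI = facility * 8 + severity (RFC 3164). A small sketch of the decoding:

```python
# Decode a syslog PRI value into facility number and severity name,
# as the syslog_pri filter does: PRI = facility * 8 + severity.
SEVERITIES = ["emergency", "alert", "critical", "error",
              "warning", "notice", "informational", "debug"]

def decode_pri(pri):
    facility, severity = divmod(pri, 8)
    return facility, SEVERITIES[severity]

# <86> is a typical auth-log priority: facility 10 (authpriv), severity 6.
facility, severity = decode_pri(86)
print(facility, severity)   # 10 informational
```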
  43. Conclusion • With these config files and two running Logstash instances we have the logs in Elasticsearch • Kibana can be used for graphs and analyses
  45. Kibana: combined Apache error entry

  46. Kibana: access graph

  47. Kibana: access cities, browsers and devices

  48. End That's all. For more information just search for Kibana, Logstash or Elasticsearch.