For this example I have generated logs in JSON format and used them as the sample data set. This is a sample entry from the data set:
{"id":1,"first_name":"Freeman","last_name":"Jowers","email":"fjowers0@mashable.com","gender":"Male","ip_address":"15.128.77.162","latitude":9.9004655,"longitude":13.0544185,"date":"2017-10-29T17:47:59Z","country":"Nigeria"}
Logstash Configuration
First I will show the Logstash configuration I have used to read the custom JSON logs, and then describe it section by section.
input {
  file {
    path => ["/data/sample-data.json"]
    type => "json"
    start_position => "beginning"
  }
}
filter {
  # Extract the JSON body of each log line into a new field.
  grok {
    match => ['message', '(?<body>\"id\":.*\"country\":\"[^"]+\")']
    add_field => ["json_body", "{%{body}}"]
  }
  # Parse the extracted JSON into individual event fields.
  json {
    source => "json_body"
    remove_field => ["message", "body", "json_body"]
  }
  # Build a geo_point-compatible array; geo_point arrays expect [lon, lat] order.
  mutate {
    add_field => ["[geoip][location]", "%{[longitude]}"]
    add_field => ["[geoip][location]", "%{[latitude]}"]
  }
  mutate {
    convert => ["[geoip][location]", "float"]
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
A Logstash configuration has three sections:
- Input
- Filter
- Output
Logstash configurations are built around plugins, and there are currently around 200 plugins available. In the configuration above, the input section describes how the event source connects to Logstash; in this case the events come from a file.
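As a side note, the file input keeps track of how far it has read via a sincedb file. If you want Logstash to re-read the sample file from the beginning on every test run, a sincedb_path override like the sketch below can help (the option values shown are just an example):

input {
  file {
    path => ["/data/sample-data.json"]
    type => "json"
    start_position => "beginning"
    # Don't persist the read position, so the file is re-read on every run (useful while testing).
    sincedb_path => "/dev/null"
  }
}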
The filter section describes how events are manipulated. In my case I have used the grok and json plugins to extract and parse each JSON entry, and the mutate plugin to turn the longitude and latitude values into geo_point data.
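As an aside, because each line in this sample file is already a complete JSON document, a simpler variant would be to parse the message field directly with the json filter and skip grok; the grok approach above is mainly useful when the JSON body is embedded inside a larger log line. A minimal sketch of that variant:

filter {
  # Parse the whole log line as JSON (works when each line is a complete JSON object).
  json {
    source => "message"
    remove_field => ["message"]
  }
}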
The output section describes where the processed events should be sent. This can be a stream processing system such as Kafka, a database, or even another Logstash instance. In the scenario above, the output is Elasticsearch, with a stdout output added for debugging.
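For example, if the events were to be pushed to Kafka instead, the output section could use the kafka output plugin. The broker address and topic name below are assumptions for illustration only:

output {
  kafka {
    # Assumed broker address and topic name - adjust to your environment.
    bootstrap_servers => "localhost:9092"
    topic_id => "sample-logs"
    codec => json
  }
}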
Read Logs Using Logstash
./logstash -f config-file.conf
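Once Logstash has processed the file, you can check that the events reached Elasticsearch. Assuming the default logstash-* index naming, a query like the following should return the indexed entries:

curl "localhost:9200/logstash-*/_search?pretty&size=1"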