Logstash - Enrich IPs with Geolocation using MaxMind GeoLite2 City and GeoIP2 ISP
Today we wanted to parse some JSON logs we had in a file using Logstash and enrich them with geolocation information about the city and the ISP an IP address belongs to. The file (let's call it /var/log/input-geo.json) had the following structure:
{"name":"Christos","src_ip":"63.145.248.101","age":12}
{"name":"Nikos","src_ip":"98.158.156.175","age":10}
Logstash has a GeoIP filter which adds information about the geographical location of IP addresses, based on data from the MaxMind GeoLite2 databases. In our case we used:
- the GeoLite2 City database (free)
- the GeoIP2 ISP database (commercial license)
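As a side note, before wiring the databases into Logstash you can sanity-check the .mmdb files with mmdblookup from libmaxminddb, assuming it is installed; the exact lookup path depends on the database layout, but a city lookup against the first sample IP might look like this:
mmdblookup --file GeoLite2-City.mmdb --ip 63.145.248.101 city names en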
We wanted to parse the JSON file, enrich the src_ip field, and then forward the events to Elasticsearch. For debugging purposes we also enabled a file output. Thus, the configuration (logstash.conf) looked like the following:
input {
  file {
    path => ["/var/log/input-geo.json"]
    sincedb_path => "/usr/share/logstash/geo-sincedb1"
    start_position => "beginning"
    codec => "json"
    type => "mygeo"
  }
}
filter {
  # Enrich with City geolocation information using the free GeoLite2 City database
  if [src_ip] {
    geoip {
      source => "src_ip"
      target => "src_geoip"
      database => "/usr/share/logstash/GeoLite2-City.mmdb"
    }
  }
  # Enrich with ISP information using the commercial GeoIP2 ISP database
  if [src_ip] {
    geoip {
      source => "src_ip"
      target => "src_geoip"
      database => "/usr/share/logstash/GeoIP2-ISP.mmdb"
    }
  }
}
output {
  # stdout { codec => rubydebug }
  # Output successful messages to a file (the geoip filter tags failed lookups with _geoip_lookup_failure)
  if "_geoip_lookup_failure" not in [tags] {
    file {
      path => "/usr/share/logstash/success-debug.txt"
      codec => json_lines
      # codec => rubydebug
    }
  }
  # Output failed messages to a file
  if "_geoip_lookup_failure" in [tags] {
    file {
      path => "/usr/share/logstash/failed-debug.txt"
      codec => rubydebug
    }
  }
  # Output successful messages to Elasticsearch
  elasticsearch {
    hosts => ["192.168.1.3:9200"]
    index => "json_logs-%{+YYYY.MM.dd}"
  }
}
To run Logstash we chose the quickest way, which was to run it in Docker, so we put all the required Logstash configuration, logs and MaxMind databases in a single directory:
linux@linux-VM:~$ ls -l
-rwxrwxr-x 1 linux linux 26331174 Aug 6 19:11 GeoIP2-ISP.mmdb
-rwxrwxr-x 1 linux linux 51469823 Aug 6 19:11 GeoLite2-City.mmdb
-rw-rw-r-- 1 linux linux 107 Aug 13 19:13 input-geo.json
-rwxrwxr-x 1 linux linux 2244 Aug 13 19:32 logstash.conf
Running Logstash on Docker is relatively easy, as Docker images are available from the Elastic Docker registry. You can find more information here. Let's run an interactive Docker container with Logstash 6.3.2:
docker run -it \
  --rm \
  --name logstash \
  -v $(pwd)/GeoIP2-ISP.mmdb:/usr/share/logstash/GeoIP2-ISP.mmdb \
  -v $(pwd)/GeoLite2-City.mmdb:/usr/share/logstash/GeoLite2-City.mmdb \
  -v $(pwd)/input-geo.json:/var/log/input-geo.json \
  -v $(pwd)/logstash.conf:/usr/share/logstash/pipeline/logstash.conf \
  docker.elastic.co/logstash/logstash-oss:6.3.2
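Once the container is up, a quick way to watch the enriched events as they are written is to tail the debug file from another shell; this assumes the container name logstash set in the command above:
docker exec logstash tail -f /usr/share/logstash/success-debug.txt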
While Logstash is running, if you examine Elasticsearch or /usr/share/logstash/success-debug.txt, you will notice that the messages contain a lot of geolocation information and resemble the following:
{
  "src_ip": "63.145.248.101",
  "path": "/var/log/input-geo.json",
  "host": "a14a2394ba12",
  "@timestamp": "2018-08-13T17:33:05.894Z",
  "type": "mygeo",
  "src_geoip": {
    "postal_code": "94804",
    "country_code3": "US",
    "dma_code": 807,
    "continent_code": "NA",
    "region_name": "California",
    "longitude": -122.3437,
    "ip": "63.145.248.101",
    "isp": "CenturyLink",
    "region_code": "CA",
    "latitude": 37.9255,
    "as_org": "Qwest Communications Company, LLC",
    "country_code2": "US",
    "asn": 209,
    "location": {
      "lon": -122.3437,
      "lat": 37.9255
    },
    "organization": "CenturyLink",
    "city_name": "Richmond",
    "country_name": "United States",
    "timezone": "America/Los_Angeles"
  },
  "name": "Christos",
  "age": 12,
  "@version": "1"
}
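To confirm the documents also made it into Elasticsearch, a simple search against the index pattern used in the configuration (assuming the Elasticsearch host and index name shown above) can be run from the Docker host:
curl 'http://192.168.1.3:9200/json_logs-*/_search?pretty&size=2'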
Thank God we have a geolocation result! We hope this article helped you get up and running!