ELK: Configuring and Using the Logstash Plugins grok and geoip

This article explains how to use two commonly used Logstash plugins, grok and geoip:

1. Using grok to output structured data

Edit the first-pipeline.conf file and change it to the following:

input{
  #stdin{type => stdin}
  file {
    # path of the file to read
    path => ["/tmp/access.log"]
    start_position => "beginning"
  }
}

filter{
  grok{
    match => {"message" => "%{COMBINEDAPACHELOG}" }
  }

}

output{
  stdout{codec => rubydebug}
}
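
Note that the file input keeps track of how far it has read via a sincedb file, so start_position => "beginning" only takes effect the first time a file is seen. For repeated testing it can be convenient to disable that bookkeeping; a minimal sketch (the sincedb_path line is my addition, not part of the article's pipeline):

input{
  file {
    path => ["/tmp/access.log"]
    start_position => "beginning"
    # sketch: discard the stored read position so the file is re-read from the
    # beginning on every start (useful for testing only)
    sincedb_path => "/dev/null"
  }
}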

After starting Logstash with ./logstash -f ../config/first-pipeline.conf, the output is now structured data:

{
        "message" => "140.77.188.102 - - [25/Jun/2022:05:11:33 +0800] \"GET /api/ss/api/v1/login/getBaseUrl HTTP/1.1\" 200 103 \"-\" \"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/534+ (KHTML, like Gecko) BingPreview/1.0b\"",
       "response" => "200",
           "auth" => "-",
          "bytes" => "103",
       "referrer" => "\"-\"",
           "host" => "nb002",
       "@version" => "1",
          "agent" => "\"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/534+ (KHTML, like Gecko) BingPreview/1.0b\"",
     "@timestamp" => 2022-06-26T00:28:24.302Z,
      "timestamp" => "25/Jun/2022:05:11:33 +0800",
          "ident" => "-",
    "httpversion" => "1.1",
           "path" => "/tmp/access.log",
       "clientip" => "140.77.188.102",
           "verb" => "GET",
        "request" => "/api/ss/api/v1/login/getBaseUrl"
}
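
The field names above (clientip, verb, request, response, and so on) come from the %{COMBINEDAPACHELOG} pattern, which is built out of smaller grok patterns. As a rough hand-written equivalent (for illustration only; the exact field names can differ between pattern versions), the match could be spelled out like this:

filter{
  grok{
    # approximate expansion of %{COMBINEDAPACHELOG}, shown for illustration
    match => { "message" => "%{IPORHOST:clientip} %{USER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] \"%{WORD:verb} %{NOTSPACE:request} HTTP/%{NUMBER:httpversion}\" %{NUMBER:response} %{NUMBER:bytes} %{QS:referrer} %{QS:agent}" }
  }
}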

2. Using mutate to modify the output data

Edit the first-pipeline.conf file and change it to the following:

input{
  #stdin{type => stdin}
  file {
    path => ["/tmp/access.log"]
    start_position => "beginning"
  }
}

filter{
  grok{
    match => {"message" => "%{COMBINEDAPACHELOG}" }
  }
  mutate{
    # rename a field
    rename => {"clientip" => "cip"}
  }
  mutate{
    # remove specific fields
    remove_field => ["timestamp","agent"]
  }
}

output{
  stdout{codec => rubydebug}
}

After restarting ./logstash -f ../config/first-pipeline.conf, append a new entry to /tmp/access.log and check the output: the "clientip" field has become "cip", and the timestamp and agent fields are gone. Nice.

{
           "verb" => "GET",
     "@timestamp" => 2022-06-26T00:48:28.224Z,
       "referrer" => "\"-\"",
           "path" => "/tmp/access.log",
           "auth" => "-",
        "message" => "140.77.188.102 - - [25/Jun/2022:05:11:33 +0800] \"GET /api/ss/api/v1/login/getBaseUrl HTTP/1.1\" 200 103 \"-\" \"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/534+ (KHTML, like Gecko) BingPreview/1.0b\"",
       "@version" => "1",
          "ident" => "-",
       "response" => "200",
          "bytes" => "103",
        "request" => "/api/ss/api/v1/login/getBaseUrl",
    "httpversion" => "1.1",
           "host" => "nb002",
            "cip" => "140.77.188.102"
}
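
The two mutate blocks can also be merged into a single mutate, and further options such as convert are available. A small sketch, not taken from the article's pipeline (the convert line is an extra illustration):

filter{
  mutate{
    # rename, remove and convert in one block
    rename       => { "clientip" => "cip" }
    remove_field => ["timestamp", "agent"]
    # illustration: turn the string "103" into the integer 103
    convert      => { "bytes" => "integer" }
  }
}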

3. Using the geoip filter plugin

The geoip filter plugin can be used to enrich the data: it looks up geolocation information for an IP address.

Edit the first-pipeline.conf file and change it to the following:

input{
  #stdin{type => stdin}
  file {
    path => ["/tmp/access.log"]
    start_position => "beginning"
  }
}

filter{
  grok{
    match => {"message" => "%{COMBINEDAPACHELOG}" }
  }
  mutate{
    # rename a field
    rename => {"clientip" => "cip"}
  }
  mutate{
    # remove specific fields
    remove_field => ["timestamp","agent"]
  }
  geoip{
    # because clientip was renamed to cip above, use cip here; without the rename, use clientip instead
    source => "cip"
  }
}

output{
  stdout{codec => rubydebug}
}
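
An optional refinement that is not part of the original pipeline: when grok fails to parse a line it tags the event with _grokparsefailure and cip is never set, so the lookup can be skipped in that case. A sketch:

filter{
  # sketch: only attempt the geoip lookup when grok parsed the line successfully
  if "_grokparsefailure" not in [tags] {
    geoip{
      source => "cip"
    }
  }
}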

After restarting ./logstash -f ../config/first-pipeline.conf, append a new entry to /tmp/access.log and check the output: a geoip field has been added, showing geolocation details such as city, country, region, and latitude/longitude.

Example with a foreign IP:

{
           "host" => "nb002",
           "auth" => "-",
          "bytes" => "103",
            "cip" => "140.77.188.104",
       "@version" => "1",
        "message" => "140.77.188.104 - - [25/Jun/2022:05:11:33 +0800] \"GET /api/ss/api/v1/login/getBaseUrl HTTP/1.1\" 200 103 \"-\" \"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/534+ (KHTML, like Gecko) BingPreview/1.0b\"",
           "verb" => "GET",
        "request" => "/api/ss/api/v1/login/getBaseUrl",
       "referrer" => "\"-\"",
       "response" => "200",
          "ident" => "-",
           "path" => "/tmp/access.log",
     "@timestamp" => 2022-06-26T00:58:11.786Z,
          "geoip" => {
	         "country_code3" => "FR",
	             "longitude" => 4.85,
	                    "ip" => "140.77.188.104",
	        "continent_code" => "EU",
	           "region_name" => "Rhône",
	         "country_code2" => "FR",
	              "timezone" => "Europe/Paris",
	          "country_name" => "France",
	           "region_code" => "69",
	              "latitude" => 45.748,
	           "postal_code" => "69007",
	              "location" => {
	            "lat" => 45.748,
	            "lon" => 4.85
        },
             "city_name" => "Lyon"
    },
    "httpversion" => "1.1"
}

Example with a domestic (Chinese) IP:

{
           "host" => "nb002",
           "auth" => "-",
          "bytes" => "103",
            "cip" => "175.30.108.241",
       "@version" => "1",
        "message" => "175.30.108.241 - - [25/Jun/2022:05:11:33 +0800] \"GET /api/ss/api/v1/login/getBaseUrl HTTP/1.1\" 200 103 \"-\" \"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/534+ (KHTML, like Gecko) BingPreview/1.0b\"",
           "verb" => "GET",
        "request" => "/api/ss/api/v1/login/getBaseUrl",
       "referrer" => "\"-\"",
       "response" => "200",
          "ident" => "-",
           "path" => "/tmp/access.log",
     "@timestamp" => 2022-06-26T01:00:11.972Z,
          "geoip" => {
         "country_code3" => "CN",
             "longitude" => 125.3247,
                    "ip" => "175.30.108.241",
        "continent_code" => "AS",
           "region_name" => "Jilin",
         "country_code2" => "CN",
              "timezone" => "Asia/Shanghai",
          "country_name" => "China",
           "region_code" => "JL",
              "latitude" => 43.88,
              "location" => {
            "lat" => 43.88,
            "lon" => 125.3247
        },
             "city_name" => "Changchun"
    },
    "httpversion" => "1.1"
}
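
In a real deployment the enriched events would normally be shipped to Elasticsearch rather than printed with rubydebug. A minimal sketch of such an output block, assuming Elasticsearch is running locally on port 9200 (the host URL and index name are placeholders, not values from this article):

output{
  elasticsearch {
    # placeholder connection details, adjust to your cluster
    hosts => ["http://localhost:9200"]
    index => "access-log-%{+YYYY.MM.dd}"
  }
}

Depending on the index template in use, the geoip location field can then be mapped as a geo_point and plotted on a Kibana map.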

END
