Up-to-Date EFK (Elasticsearch + FileBeat + Kibana) Log Collection


1. EFK Introduction

EFK here means Elasticsearch + FileBeat + Kibana: FileBeat collects log files and ships them to Elasticsearch, and Kibana is used to search and visualize them. FileBeat official documentation: https://www.elastic.co/guide/en/beats/filebeat/8.15/filebeat-overview.html

2. Prerequisites

Elasticsearch and Kibana must already be installed. For the installation process, see my previous article: Elasticsearch 8.15 High-Availability Cluster Setup (with Authentication & Kibana).

3. FileBeat 8.15 Download & Installation

Download: https://www.elastic.co/downloads/past-releases/filebeat-8-15-0

After downloading, upload the archive to the server.

# Create the directory
mkdir -p /opt/software/
# Extract to the target directory
tar -zxvf filebeat-8.15.0-linux-x86_64.tar.gz -C /opt/software/
# Change into the FileBeat installation directory
cd /opt/software/filebeat-8.15.0-linux-x86_64

4. Writing the FileBeat Configuration File

Note: FileBeat's default configuration file is filebeat.yml in the installation directory. Here I create a separate configuration file and point FileBeat at it on startup; adapt this to your own situation.

# Create a config directory
mkdir config
# filebeat_2_es.yml does not exist yet; it is created automatically when you save in vim
vim config/filebeat_2_es.yml

The complete content of filebeat_2_es.yml is as follows:

filebeat.inputs:                        # data input configuration
  - type: filestream                    # (required) input type: filestream
    id: nginx-access-log-1              # (required) must be globally unique
    enabled: true                       # enable this input
    paths:
      - /tmp/logs/nginx_access.log*     # files to watch; * is a wildcard, so e.g. nginx_access.log.1 is also watched
    tags: ["nginx-access-log"]          # (optional) user-defined tags, used below to select the target index and later in Kibana for filtering
    fields:
      my_server_ip: 192.168.25.31       # (optional) user-defined field, added to every collected event
    fields_under_root: true             # (optional) place the fields above at the top level of the event instead of nesting them under "fields"
    #exclude_lines: ['^DBG']            # (optional) drop lines starting with DBG
    #include_lines: ['^ERR', '^WARN']   # (optional) only collect lines starting with "ERR" or "WARN"
    
  - type: filestream
    id: nginx-error-log-1
    enabled: true
    paths:
      - /tmp/logs/nginx_error.log*
    tags: ["nginx-error-log"]
    fields:
      my_server_ip: 192.168.25.31  
    fields_under_root: true
    #exclude_lines: ['^DBG']        # (optional) drop lines starting with DBG
    include_lines: ['\[error\]']    # only collect lines containing "[error]"

  - type: filestream
    id: elasticsearch-log-1
    enabled: true
    paths:
      - /tmp/logs/elasticsearch.log*
    tags: ["elasticsearch-log"]
    fields:
      my_server_ip: 192.168.25.31  
    fields_under_root: true
    parsers:
      - multiline:          # multi-line merging: a Java exception usually spans several lines, which should be merged into one event. See: https://www.elastic.co/guide/en/beats/filebeat/8.15/multiline-examples.html
          type: pattern
          pattern: '^\['    # each log event starts with "["
          negate: true      # can be left as-is; see the multiline-examples doc above
          match: after      # can be left as-is; see the multiline-examples doc above
#output.console:
#  pretty: true
output.elasticsearch:
  enabled: true
  hosts: ["http://192.168.25.31:9200", "http://192.168.25.32:9200", "http://192.168.25.33:9200"]
  username: "elastic"
  password: "123456"
  indices:
    - index: "log-software-nginx-access-%{+yyyy.MM.dd}" 
      when.contains:
        tags: "nginx-access-log"
    - index: "log-software-nginx-error-%{+yyyy.MM.dd}" 
      when.contains:
        tags: "nginx-error-log"
    - index: "log-software-elasticsearch-%{+yyyy.MM.dd}" 
      when.contains:
        tags: "elasticsearch-log"
# ILM (index lifecycle management) must be disabled, otherwise the "indices" conditions above and the setup.template settings below will not take effect
setup.ilm.enabled: false
# The default index name is "filebeat". To use the custom index names above, a template name and pattern must be set; the pattern must match the index names configured above
setup.template.name: "log"
setup.template.pattern: "log*"
setup.template.overwrite: false
setup.template.settings:
  index.number_of_shards: 3     # number of primary shards per index in ES
  index.number_of_replicas: 1   # number of replicas per shard in ES
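The %{+yyyy.MM.dd} suffix in the index names above resolves to the event's timestamp date, so each day's logs land in a separate index. For today's events, the resulting name can be approximated in shell (illustration only):

```shell
# Rough shell equivalent of the %{+yyyy.MM.dd} date suffix
date "+log-software-nginx-access-%Y.%m.%d"
```

On 2025-02-02 this would print log-software-nginx-access-2025.02.02.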

5. Starting FileBeat

./filebeat -e -c config/filebeat_2_es.yml

Tip: Filebeat collects both existing file content and data written to the file in real time. Once content has been collected, it is not collected again. If you want Filebeat to re-collect already-collected files on startup, delete the registry first with rm -rf /opt/software/filebeat-8.15.0-linux-x86_64/data/* and then start Filebeat. The data directory stores Filebeat's collection state (which files and offsets it has already read); once it is deleted, Filebeat has no record of previous collection and re-collects the existing files from the beginning.
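The effect of the registry reset can be illustrated safely in a sandbox directory (the paths below are hypothetical stand-ins for the real Filebeat home, so nothing on your install is touched):

```shell
# Sandbox stand-in for <filebeat-home>/data
FB_DATA=$(mktemp -d)
mkdir -p "$FB_DATA/registry/filebeat"
echo '{"op":"set"}' > "$FB_DATA/registry/filebeat/log.json"   # fake registry entry
rm -rf "$FB_DATA"/*          # same reset as in the tip above
ls -A "$FB_DATA" | wc -l     # prints 0: no registry left, so the next start re-collects everything
```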

6. Generating Simulated Real-Time Log Data

Simulate nginx access logs:

cat > nginx_access.log << 'EOF'
192.168.25.83 - - [31/Jan/2025:21:42:22 +0000] "\x03\x00\x00/*\xE0\x00\x00\x00\x00\x00Cookie: mstshash=Administr" 400 157 "-" "-" "-"
192.168.25.83 - - [01/Feb/2025:11:18:18 +0000] "GET / HTTP/1.1" 400 255 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:130.0) Gecko/20100101 Firefox/130.0" "-"

EOF

Simulate nginx error logs:

cat > nginx_error.log << 'EOF'
2025/01/13 19:15:52 [error] 8#8: *634461 "/usr/local/nginx/html/index.html" is not found (2: No such file or directory), client: 172.168.110.83, server: localhost, request: "GET / HTTP/1.1", host: "192.168.25.10:80"
2025/01/17 17:37:05 [error] 10#10: *770125 "/usr/local/nginx/html/index.html" is not found (2: No such file or directory), client: 172.168.110.83, server: localhost, request: "GET / HTTP/1.1", host: "192.168.25.10:80"
I am not an error log line; let's see whether I get filtered out
 [error]I am an error message
 [error]I am error message 2
EOF
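The include_lines: ['\[error\]'] setting on the nginx-error input means only lines containing "[error]" are collected. A quick grep over an abridged sample approximates which lines survive (a sketch of the idea, not Filebeat's actual filtering engine):

```shell
# Abridged sample mirroring the error log above
cat > /tmp/err_sample.log << 'EOF'
2025/01/13 19:15:52 [error] 8#8: index.html is not found
I am not an error log line
 [error]I am an error message
EOF
grep -c '\[error\]' /tmp/err_sample.log
# prints 2: only the lines containing "[error]" would be collected
```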

Simulate ES logs (including an exception, to later verify that the multi-line exception gets merged into a single event):

cat > elasticsearch.log << 'EOF'
[2025-02-02T01:20:03,049][INFO ][o.e.c.m.MetadataIndexTemplateService] [node-2] adding index template [metrics-apm.transaction.1m@template] for index patterns [metrics-apm.transaction.1m-*]
[2025-02-02T01:20:03,113][INFO ][o.e.c.m.MetadataIndexTemplateService] [node-2] adding index template [metrics-apm.service_destination.60m@template] for index patterns [metrics-apm.service_destination.60m-*]
[2025-02-02T01:20:03,186][INFO ][o.e.c.m.MetadataIndexTemplateService] [node-2] adding index template [metrics-apm.service_transaction.60m@template] for index patterns [metrics-apm.service_transaction.60m-*]
[2025-02-02T01:20:03,232][INFO ][o.e.c.m.MetadataIndexTemplateService] [node-2] adding index template [traces-apm.rum@template] for index patterns [traces-apm.rum-*]
[2025-02-02T01:20:03,319][INFO ][o.e.c.m.MetadataIndexTemplateService] [node-2] adding index template [metrics-apm.service_destination.10m@template] for index patterns [metrics-apm.service_destination.10m-*]
[2025-02-02T01:20:03,383][INFO ][o.e.c.m.MetadataIndexTemplateService] [node-2] adding index template [metrics-apm.service_transaction.10m@template] for index patterns [metrics-apm.service_transaction.10m-*]
[2025-02-02T01:20:03,501][INFO ][o.e.c.m.MetadataIndexTemplateService] [node-2] adding index template [metrics-apm.transaction.60m@template] for index patterns [metrics-apm.transaction.60m-*]
[2025-02-02T01:20:03,582][INFO ][o.e.c.m.MetadataIndexTemplateService] [node-2] adding index template [metrics-apm.app@template] for index patterns [metrics-apm.app.*-*]
[2025-02-02T01:20:03,677][INFO ][o.e.c.m.MetadataIndexTemplateService] [node-2] adding index template [traces-apm@template] for index patterns [traces-apm-*]
[2025-02-02T01:20:03,735][INFO ][o.e.c.m.MetadataIndexTemplateService] [node-2] adding index template [logs-apm.app@template] for index patterns [logs-apm.app.*-*]
[2025-02-02T01:20:03,850][INFO ][o.e.c.m.MetadataIndexTemplateService] [node-2] adding index template [traces-apm.sampled@template] for index patterns [traces-apm.sampled-*]
[2025-02-02T01:20:03,943][INFO ][o.e.c.m.MetadataIndexTemplateService] [node-2] adding index template [metrics-apm.service_destination.1m@template] for index patterns [metrics-apm.service_destination.1m-*]
[2025-02-02T01:20:04,317][ERROR][o.e.x.c.t.IndexTemplateRegistry] [node-2] error adding ingest pipeline template [ent-search-generic-ingestion] for [enterprise_search]
java.lang.IllegalStateException: Ingest info is empty
        at org.elasticsearch.ingest.IngestService.validatePipeline(IngestService.java:648) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.ingest.IngestService.validatePipelineRequest(IngestService.java:465) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.ingest.IngestService.lambda$putPipeline$5(IngestService.java:448) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.ActionListenerImplementations$ResponseWrappingActionListener.onResponse(ActionListenerImplementations.java:245) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.support.ContextPreservingActionListener.onResponse(ContextPreservingActionListener.java:32) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.tasks.TaskManager$1.onResponse(TaskManager.java:202) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.tasks.TaskManager$1.onResponse(TaskManager.java:196) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.ActionListenerImplementations$RunBeforeActionListener.onResponse(ActionListenerImplementations.java:307) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.support.ContextPreservingActionListener.onResponse(ContextPreservingActionListener.java:32) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.ActionListenerImplementations$MappedActionListener.onResponse(ActionListenerImplementations.java:95) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.ActionListener.respondAndRelease(ActionListener.java:367) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.support.nodes.TransportNodesAction.lambda$newResponseAsync$2(TransportNodesAction.java:215) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.ActionListener.run(ActionListener.java:444) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.support.nodes.TransportNodesAction.newResponseAsync(TransportNodesAction.java:215) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.support.nodes.TransportNodesAction$1.lambda$onCompletion$4(TransportNodesAction.java:166) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.support.nodes.TransportNodesAction.lambda$doExecute$0(TransportNodesAction.java:178) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.ActionListenerImplementations$ResponseWrappingActionListener.onResponse(ActionListenerImplementations.java:245) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.support.ThreadedActionListener$1.doRun(ThreadedActionListener.java:39) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:984) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:26) ~[elasticsearch-8.15.0.jar:?]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
        at java.lang.Thread.run(Thread.java:1570) ~[?:?]
[2025-02-02T01:20:04,395][ERROR][o.e.x.c.t.IndexTemplateRegistry] [node-2] error adding ingest pipeline template [behavioral_analytics-events-final_pipeline] for [enterprise_search]
java.lang.IllegalStateException: Ingest info is empty
        at org.elasticsearch.ingest.IngestService.validatePipeline(IngestService.java:648) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.ingest.IngestService.validatePipelineRequest(IngestService.java:465) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.ingest.IngestService.lambda$putPipeline$5(IngestService.java:448) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.ActionListenerImplementations$ResponseWrappingActionListener.onResponse(ActionListenerImplementations.java:245) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.support.ContextPreservingActionListener.onResponse(ContextPreservingActionListener.java:32) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.tasks.TaskManager$1.onResponse(TaskManager.java:202) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.tasks.TaskManager$1.onResponse(TaskManager.java:196) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.ActionListenerImplementations$RunBeforeActionListener.onResponse(ActionListenerImplementations.java:307) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.support.ContextPreservingActionListener.onResponse(ContextPreservingActionListener.java:32) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.ActionListenerImplementations$MappedActionListener.onResponse(ActionListenerImplementations.java:95) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.ActionListener.respondAndRelease(ActionListener.java:367) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.support.nodes.TransportNodesAction.lambda$newResponseAsync$2(TransportNodesAction.java:215) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.ActionListener.run(ActionListener.java:444) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.support.nodes.TransportNodesAction.newResponseAsync(TransportNodesAction.java:215) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.support.nodes.TransportNodesAction$1.lambda$onCompletion$4(TransportNodesAction.java:166) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.support.nodes.TransportNodesAction.lambda$doExecute$0(TransportNodesAction.java:178) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.ActionListenerImplementations$ResponseWrappingActionListener.onResponse(ActionListenerImplementations.java:245) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.support.ThreadedActionListener$1.doRun(ThreadedActionListener.java:39) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:984) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:26) ~[elasticsearch-8.15.0.jar:?]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
        at java.lang.Thread.run(Thread.java:1570) ~[?:?]
[2025-02-02T01:20:04,399][ERROR][o.e.x.c.t.IndexTemplateRegistry] [node-2] error adding ingest pipeline template [search-default-ingestion] for [enterprise_search]
java.lang.IllegalStateException: Ingest info is empty
        at org.elasticsearch.ingest.IngestService.validatePipeline(IngestService.java:648) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.ingest.IngestService.validatePipelineRequest(IngestService.java:465) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.ingest.IngestService.lambda$putPipeline$5(IngestService.java:448) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.ActionListenerImplementations$ResponseWrappingActionListener.onResponse(ActionListenerImplementations.java:245) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.support.ContextPreservingActionListener.onResponse(ContextPreservingActionListener.java:32) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.tasks.TaskManager$1.onResponse(TaskManager.java:202) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.tasks.TaskManager$1.onResponse(TaskManager.java:196) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.ActionListenerImplementations$RunBeforeActionListener.onResponse(ActionListenerImplementations.java:307) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.support.ContextPreservingActionListener.onResponse(ContextPreservingActionListener.java:32) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.ActionListenerImplementations$MappedActionListener.onResponse(ActionListenerImplementations.java:95) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.ActionListener.respondAndRelease(ActionListener.java:367) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.support.nodes.TransportNodesAction.lambda$newResponseAsync$2(TransportNodesAction.java:215) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.ActionListener.run(ActionListener.java:444) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.support.nodes.TransportNodesAction.newResponseAsync(TransportNodesAction.java:215) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.support.nodes.TransportNodesAction$1.lambda$onCompletion$4(TransportNodesAction.java:166) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.support.nodes.TransportNodesAction.lambda$doExecute$0(TransportNodesAction.java:178) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.ActionListenerImplementations$ResponseWrappingActionListener.onResponse(ActionListenerImplementations.java:245) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.action.support.ThreadedActionListener$1.doRun(ThreadedActionListener.java:39) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:984) ~[elasticsearch-8.15.0.jar:?]
        at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:26) ~[elasticsearch-8.15.0.jar:?]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
        at java.lang.Thread.run(Thread.java:1570) ~[?:?]
EOF
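Per the multiline pattern ^\[ configured earlier, each line starting with "[" begins a new event, and the indented stack-trace lines fold into the preceding event. Counting events in a trimmed sample shows the idea (a sketch, not Filebeat itself):

```shell
cat > /tmp/es_sample.log << 'EOF'
[2025-02-02T01:20:03,049][INFO ][o.e.c.m.MetadataIndexTemplateService] [node-2] adding index template
[2025-02-02T01:20:04,317][ERROR][o.e.x.c.t.IndexTemplateRegistry] [node-2] error adding ingest pipeline template
java.lang.IllegalStateException: Ingest info is empty
        at org.elasticsearch.ingest.IngestService.validatePipeline(IngestService.java:648)
EOF
grep -c '^\[' /tmp/es_sample.log
# prints 2: the exception and its stack trace belong to the second event
```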

7. Verifying the Index (Data Stream) Creation

Log in to Kibana. Address: http://<Kibana-server-IP>:5601/


Click "Stack Management" in the left-hand menu.


Under "Stack Management", open "Index Management" --> "Data Streams".

You can see that the indices specified in the configuration file now appear as the corresponding data streams.


8. Creating a Data View

Menu location: Stack Management --> Data Views

# When creating the data view, you can use the following values.
Name: Software Logs
Index pattern: log-software*
Timestamp field: @timestamp

Note: set the index pattern according to your needs. For example, log-software-nginx-access-* would produce a data view containing only the nginx access logs, whereas log-software* above matches all of the software logs, i.e. all three data streams created earlier.

9. Viewing the Data View

Menu location: Discover


Check whether the multi-line ES exception was merged into a single event:

The multiple lines of one exception are now combined into a single event.

10. Filtering Collected Logs with KQL

KQL examples:

Example 1: filter for ES logs.

Query:

tags : "elasticsearch-log"

Example 2: filter for ES logs that also contain "error".

Query:

tags : "elasticsearch-log" and message:error
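A few more KQL patterns that work with the tags and fields collected above (sketches; adjust the field values to your data):

```
tags : "nginx-error-log" and my_server_ip : "192.168.25.31"
message : *index.html* and not message : *warn*
tags : ("nginx-access-log" or "nginx-error-log")
```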

11. Configuring Log Data Retention (Extended Topic)

In the Filebeat configuration above we defined an index template named "log" (setup.template.name). By default, indices and data streams created from that template keep their log data forever, so we need to change the retention period in the index template (via the Kibana UI).

Modify the index template:

In the "log" index template, enable "Data retention", set the desired number of retention days, and click Next through the remaining steps.


Delete the previously auto-created data streams.

On the Filebeat host, run rm -rf data/* and start Filebeat again to re-collect the log files; the data streams are then recreated automatically. Make sure setup.template.overwrite stays false in the Filebeat configuration so that Filebeat does not overwrite the index template you just modified.

Back in Kibana's Index Management, the auto-created data streams now show a "Data retention" value that matches the number of days set in the index template.

That's it, all done!
