Collecting logs from multiple directories; my configuration:
```yaml
- type: log
  enabled: true
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /opt/nginx-1.14.0/logs/stars/star.access.log  # location of the log file to read
  tags: ["nginx-access"]  # use tags to tell the different logs apart
- type: log
  enabled: true
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /srv/www/runtime/logs/app.log
  tags: ["app"]  # use tags to tell the different logs apart
```
When collecting logs, our own application logs in particular often contain multi-line traces, so multi-line matching has to be configured. Filebeat provides multiline options for merging a multi-line log entry into a single event.
The multiline options come down to three main parameters:
```yaml
multiline.pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}'
multiline.negate: true
multiline.match: after
```

The configuration above means: any line that does not start with a date in this format is appended to the end of the previous line (the regex is rough, bear with it).
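To see what `negate: true` plus `match: after` actually does, here is a minimal Python sketch of the merge logic (not Filebeat's implementation, and the sample log lines are hypothetical): lines not matching the date pattern get glued onto the previous event.

```python
import re

# Same date pattern as multiline.pattern in the config above.
pattern = re.compile(r"^[0-9]{4}-[0-9]{2}-[0-9]{2}")

def merge(lines):
    """Merge continuation lines into the preceding dated line,
    mimicking multiline.negate=true + multiline.match=after."""
    events = []
    for line in lines:
        if pattern.match(line) or not events:
            events.append(line)            # dated line starts a new event
        else:
            events[-1] += "\n" + line      # continuation line: append to previous event
    return events

# Hypothetical sample: one error with a two-line trace, then a normal line.
lines = [
    "2018-09-01 12:00:00 [error] something failed",
    "Traceback (most recent call last):",
    '  File "app.py", line 10',
    "2018-09-01 12:00:01 [info] next request",
]
print(len(merge(lines)))  # -> 2 events instead of 4 raw lines
```

Four raw lines collapse into two events, which is exactly why the app log below needs these three settings while the single-line nginx access log does not.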
So, the final configuration for collecting logs from different directories, with multi-line matching, is:
```yaml
- type: log
  enabled: true
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /opt/nginx-1.14.0/logs/stars/star.access.log
  tags: ["nginx-access"]
- type: log
  enabled: true
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /srv/www/runtime/logs/app.log
  tags: ["app"]
  multiline.pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}'
  multiline.negate: true
  multiline.match: after
```
The key is marking each input so the different logs can be told apart downstream: here tags is used, though a custom type key under fields would serve the same purpose.
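As a minimal sketch (plain Python, not Logstash) of what this tagging buys you downstream, a consumer can branch on the event's tags to decide where each log goes, the same way the Logstash pipeline later routes to different indexes. The function and index names below just mirror this article's setup:

```python
def route(event):
    """Pick a destination index based on the event's tags,
    mirroring the tag-based conditionals in the Logstash config."""
    tags = event.get("tags", [])
    if "nginx-access" in tags:
        return "star_nginx_access_index_pattern"
    elif "app" in tags:
        return "star_app_index_pattern"
    return "unmatched"

print(route({"tags": ["nginx-access"], "message": "..."}))  # -> star_nginx_access_index_pattern
print(route({"tags": ["app"], "message": "..."}))           # -> star_app_index_pattern
```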
Restart the filebeat service: systemctl restart filebeat
Reference links:

https://www.cnblogs.com/zhjh2...

https://blog.csdn.net/m0_3788...
Logstash configuration file: apply a different filter to each kind of log.
```
# Logstash branches on the tags field
input {
  beats {
    host => '0.0.0.0'
    port => 5401
  }
}

filter {
  if "nginx-access" in [tags] {  # match the nginx-access logs
    # the grok plugin parses the log line, mainly via regular expressions
    grok {
      match => { "message" => "%{IPORHOST:remote_ip} - %{IPORHOST:host} - \[%{HTTPDATE:access_time}\] \"%{WORD:http_method} %{DATA:url} HTTP/%{NUMBER:http_version}\" - %{DATA:request_body} - %{INT:http_status} %{INT:body_bytes_sent} \"%{DATA:refer}\" \"%{DATA:user_agent}\" \"%{DATA:x_forwarded_for}\" \"%{DATA:upstream_addr}\" \"response_location:%{DATA:response_location}\"" }
    }
  } else if "app" in [tags] {
    grok {
      match => { "message" => "%{DATESTAMP:log_time} \[%{IP:remote_ip}\]\[%{INT:uid}\]\[%{DATA:session_id}\]\[%{WORD:log_level}\]\[%{DATA:category}\] %{GREEDYDATA:message_text}" }
    }
  }
}

output {
  if "nginx-access" in [tags] {
    elasticsearch {
      hosts    => ["http://xxx.xxx.xxx.xx:9200"]
      index    => "star_nginx_access_index_pattern-%{+YYYY.MM.dd}"
      user     => "elastic"
      password => "!@#j3C"
    }
  } else if "app" in [tags] {
    elasticsearch {
      hosts    => ["http://xxx.xxx.xxx.xx:9200"]
      index    => "star_app_index_pattern-%{+YYYY.MM.dd}"
      user     => "elastic"
      password => "!@#j3C"
    }
  }
}
```
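Grok patterns ultimately compile down to named regular expressions. As a rough illustration, the app-log pattern above is approximately the Python regex below, with DATESTAMP, IP, INT, DATA, WORD, and GREEDYDATA replaced by simplified equivalents; the sample log line and field values are hypothetical, not taken from the article:

```python
import re

# Rough Python approximation of the app-log grok pattern above.
APP_LOG = re.compile(
    r"(?P<log_time>\d{2,4}[-/]\d{2}[-/]\d{2,4} \d{2}:\d{2}:\d{2})"   # ~DATESTAMP
    r" \[(?P<remote_ip>\d{1,3}(?:\.\d{1,3}){3})\]"                   # ~IP
    r"\[(?P<uid>\d+)\]"                                              # ~INT
    r"\[(?P<session_id>[^\]]*)\]"                                    # ~DATA
    r"\[(?P<log_level>\w+)\]"                                        # ~WORD
    r"\[(?P<category>[^\]]*)\] "                                     # ~DATA
    r"(?P<message_text>.*)"                                          # ~GREEDYDATA
)

# Hypothetical sample line in the format the pattern expects.
sample = "2018-09-01 12:00:00 [10.0.0.1][42][abc123][error][db] query failed"
m = APP_LOG.match(sample)
print(m.group("log_level"))  # -> error
```

This kind of offline check makes it much faster to iterate on the field boundaries before reloading the real Logstash pipeline.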
The bulk of the work is getting the grok regular expressions to match, so that each log line is parsed into individual fields that can then be displayed in Kibana.
Restart logstash: systemctl restart logstash
Grok patterns can be debugged and verified online at: http://grokdebug.herokuapp.com/