I am following the filebeat -> logstash -> elasticsearch -> kibana pipeline. Filebeat works successfully and picks up logs from the target file.
Logstash receives the logs on the input plugin, but they bypass the filter plugin and go straight to the output plugin.
filebeat.yml
# ============================== Filebeat inputs ===============================
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - D:serverslogsch5shdmtbuil100TeamCity.BuildServer-logslauncher.log
  fields:
    type: launcherlogs

- type: filestream
  # Change to true to enable this input configuration.
  enabled: false
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/*.log

# =================================== Kibana ===================================
setup.kibana:
  host: "localhost:5601"

# ------------------------------ Logstash Output -------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["localhost:5044"]
logstash.conf
input {
  beats {
    port => "5044"
  }
}
filter {
  if [fields][type] == "launcherlogs" {
    grok {
      match => { "message" => %{YEAR:year}-%{MONTH:month}-%{MONTHDAY:day}%{DATA:loglevel}%{SPACE}-%{SPACE}%{DATA:class}%{SPACE}-%{GREEDYDATA:message} }
    }
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "poclogsindex"
  }
}
I am able to send logs to Kibana, but the grok pattern does not produce the desired JSON there. The JSON rendered in Kibana does not show all the fields defined in the pattern. Please advise.
Your grok pattern does not match the sample log line you gave in the comments: several parts are missing (the brackets, the HH:mm:ss,SSS part, and an additional space). The Grok Debugger is your friend ;-)
Instead of:
%{YEAR:year}-%{MONTH:month}-%{MONTHDAY:day}%{DATA:loglevel}%{SPACE}-%{SPACE}%{DATA:class}%{SPACE}-%{GREEDYDATA:message}
your pattern should be:
\[%{TIMESTAMP_ISO8601:timestamp}\] %{DATA:loglevel}%{SPACE}-%{SPACE}%{DATA:class}%{SPACE}-%{GREEDYDATA:message}
TIMESTAMP_ISO8601 matches that date/time format. (Note that the literal brackets must be escaped as \[ and \], since grok patterns are regular expressions.)
Also, I always double-quote the pattern, so the grok section becomes:
grok {
  match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] %{DATA:loglevel}%{SPACE}-%{SPACE}%{DATA:class}%{SPACE}-%{GREEDYDATA:message}" }
}
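A quick way to sanity-check the capture groups outside the Grok Debugger is to translate the pattern into an equivalent Python regex. This is only a rough sketch: the sample line below is a hypothetical TeamCity launcher.log entry in the `[timestamp] LEVEL - class - message` layout, not one taken from the question.

```python
import re

# Hypothetical sample line (assumed format, not from the original question).
sample = "[2022-11-05 12:32:07,690] INFO - jetbrains.buildServer.SERVER - Server started"

# Rough Python equivalent of the corrected grok pattern:
# \[%{TIMESTAMP_ISO8601:timestamp}\] %{DATA:loglevel}%{SPACE}-%{SPACE}%{DATA:class}%{SPACE}-%{GREEDYDATA:message}
pattern = re.compile(
    r"\[(?P<timestamp>\d{4}-\d{2}-\d{2}[ T]\d{2}:\d{2}:\d{2}(?:[.,]\d+)?)\] "
    r"(?P<loglevel>\S+)\s*-\s*(?P<class>\S+)\s*-\s*(?P<message>.*)"
)

m = pattern.match(sample)
if m:
    print(m.groupdict())
```

If the equivalent regex fails to match your real log lines here, the grok filter will fail the same way and tag the event with `_grokparsefailure`, which is why the extra fields never appear in Kibana.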