How to change the "message" value in the index



In the Logstash pipeline or index pattern, how can I change the following part of the CDN log's "message" field to separate out or extract some of the data, and then aggregate on it?

<40> 2022-01-17T08:31:22Z logserver-5 testcdn[1]: {"method":"GET","scheme":"https","domain":"www.123.com","uri":"/product/10809350","ip":"66.249.65.174","ua":"Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/97.0.4692.71 Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)","country":"US","asn":15169,"content_type":"text/html; charset=utf-8","status":200,"server_port":443,"bytes_sent":1892,"bytes_received":1371,"upstream_time":0.804,"cache":"MISS","request_id":"b017d78db4652036250148216b0a290c"}

Expected change:

{"method":"GET","scheme":"https","domain":"www.123.com","uri":"/product/10809350","ip":"66.249.65.174","ua":"Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/97.0.4692.71 Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)","country":"US","asn":15169,"content_type":"text/html; charset=utf-8","status":200,"server_port":443,"bytes_sent":1892,"bytes_received":1371,"upstream_time":0.804,"cache":"MISS","request_id":"b017d78db4652036250148216b0a290c"}

Because this part "<40> 2022-01-17T08:31:22Z logserver-5 testcdn[1]:" is not parsed as JSON, I cannot create visualization dashboards based on some of the fields (such as country, asn, etc.).

The raw log indexed by Logstash is:

{
  "_index": "logstash-2022.01.17-000001",
  "_type": "_doc",
  "_id": "Qx8pZ34BhloLEkDviGxe",
  "_version": 1,
  "_score": 1,
  "_source": {
    "message": "<40> 2022-01-17T08:31:22Z logserver-5 testcdn[1]: {\"method\":\"GET\",\"scheme\":\"https\",\"domain\":\"www.123.com\",\"uri\":\"/product/10809350\",\"ip\":\"66.249.65.174\",\"ua\":\"Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/97.0.4692.71 Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)\",\"country\":\"US\",\"asn\":15169,\"content_type\":\"text/html; charset=utf-8\",\"status\":200,\"server_port\":443,\"bytes_sent\":1892,\"bytes_received\":1371,\"upstream_time\":0.804,\"cache\":\"MISS\",\"request_id\":\"b017d78db4652036250148216b0a290c\"}",
    "port": 39278,
    "@timestamp": "2022-01-17T08:31:22.100Z",
    "@version": "1",
    "host": "93.115.150.121"
  },
  "fields": {
    "@timestamp": [
      "2022-01-17T08:31:22.100Z"
    ],
    "port": [
      39278
    ],
    "@version": [
      "1"
    ],
    "host": [
      "93.115.150.121"
    ],
    "message": [
      "<40> 2022-01-17T08:31:22Z logserver-5 testcdn[1]: {\"method\":\"GET\",\"scheme\":\"https\",\"domain\":\"www.123.com\",\"uri\":\"/product/10809350\",\"ip\":\"66.249.65.174\",\"ua\":\"Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/97.0.4692.71 Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)\",\"country\":\"US\",\"asn\":15169,\"content_type\":\"text/html; charset=utf-8\",\"status\":200,\"server_port\":443,\"bytes_sent\":1892,\"bytes_received\":1371,\"upstream_time\":0.804,\"cache\":\"MISS\",\"request_id\":\"b017d78db4652036250148216b0a290c\"}"
    ],
    "host.keyword": [
      "93.115.150.121"
    ]
  }
}

Thanks

Thank you, this was very helpful. Your suggestion for this specific scenario gave me an idea, and the following edited logstash.conf solved the problem:

input {
  tcp {
    port => 5000
    codec => json
  }
}
filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:Junk}: %{GREEDYDATA:request}" }
  }
  json { source => "request" }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    manage_template => false
    ecs_compatibility => disabled
    index => "logs-%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}
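For what it's worth, the reason this grok pattern works is that GREEDYDATA is greedy: Junk backtracks to the last ": " in the line, so request ends up holding exactly the JSON body. A quick sketch of that behaviour in Python (a simplified stand-in regex, with a shortened sample line):

```python
import json
import re

# Simplified stand-in for the grok pattern above: GREEDYDATA is a greedy .*,
# so "Junk" backtracks to the LAST ": " and "request" captures the JSON body.
msg = ('<40> 2022-01-17T08:31:22Z logserver-5 testcdn[1]: '
       '{"method":"GET","status":200,"cache":"MISS"}')
m = re.search(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\S*) "
    r"(?P<Junk>.*): (?P<request>.*)",
    msg,
)
request = json.loads(m.group("request"))
print(m.group("Junk"))   # logserver-5 testcdn[1]
print(request["cache"])  # MISS
```

Note that this relies on ": " (colon plus space) not appearing inside the JSON payload itself; the compact JSON emitted by the CDN satisfies that.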

But my main concern is about editing the configuration file. I would prefer to make any changes in the Kibana web UI rather than in logstash.conf, because we use ELK for different scenarios across our organization, and a change like this turns the ELK server into a single-purpose setup instead of a multi-purpose one. How can I get the same result without changing the Logstash configuration file?

Add these configurations to the filter section of your Logstash config:

# To parse the message field
grok {
  match => { "message" => "<%{NONNEGINT:syslog_pri}>\s+%{TIMESTAMP_ISO8601:syslog_timestamp}\s+%{DATA:sys_host}\s+%{NOTSPACE:sys_module}\s+%{GREEDYDATA:syslog_message}" }
}
# To replace the message field with syslog_message
mutate {
  replace => [ "message", "%{syslog_message}" ]
}
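As a rough sanity check, the grok pattern above can be approximated with a plain regex. This is only a sketch: TIMESTAMP_ISO8601 is simplified, NONNEGINT becomes \d+, and DATA/NOTSPACE both become \S+, which is close enough for this log shape:

```python
import re

# Approximate Python equivalent of the grok pattern above.
SYSLOG_RE = re.compile(
    r"<(?P<syslog_pri>\d+)>\s+"
    r"(?P<syslog_timestamp>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\S*)\s+"
    r"(?P<sys_host>\S+)\s+"
    r"(?P<sys_module>\S+)\s+"      # NOTSPACE keeps the trailing colon
    r"(?P<syslog_message>.*)$"
)

line = ('<40> 2022-01-17T08:31:22Z logserver-5 testcdn[1]: '
        '{"method":"GET","country":"US","asn":15169,"status":200}')
m = SYSLOG_RE.match(line)
print(m.group("syslog_pri"))      # 40
print(m.group("sys_module"))      # testcdn[1]:
print(m.group("syslog_message"))  # the raw JSON payload
```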

Once the message field has been replaced by syslog_message, you can add the json filter below to parse the JSON into separate fields.

json {
  source => "syslog_message"
}
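After that json filter runs, every key in the payload becomes its own event field, which is exactly what the aggregations need. A minimal illustration of the same step (with a shortened, hypothetical sample payload):

```python
import json

# syslog_message now holds only the JSON payload extracted by grok.
syslog_message = ('{"method":"GET","country":"US","asn":15169,'
                  '"status":200,"cache":"MISS"}')
event = json.loads(syslog_message)

# Fields such as country and asn are now individually addressable,
# so Kibana can build visualizations and aggregations on them.
print(event["country"])  # US
print(event["asn"])      # 15169
```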
