I am sending this log through logstash:
发送此日志2017-02-27T13:00:07+01:00 test {"createdAt":"2017-02-27T13:00:07+0100","cluster":"undefined","nodeName":"undefined","nodeIP":"10.11.11.50","clientIP":"10.11.11.72","customerId":1,"identityId":332,"appType":"admin","eventGroup":"education","eventName":"insert","eventData":{"education_insert":{"type":"course","data":{"education_id":2055,"education":{"id":2055,"customer_id":1,"creator_id":332,"type":"course","status":"new","is_featured":false,"enroll_deadline":null,"complete_deadline":null,"count_view":0,"count_like":0,"meta_title":"test Course - progress","meta_description":"test Course - progress","discoverable":"everyone","progress_max":0,"instructor_ids":[332],"tag_ids":[135],"discoverable_group_ids":[],"category_ids":[14],"audits":null,"instructors":null,"creator":null,"lessonGroups":null,"categories":null},"duration":"quick"}}},"scopeType":"education","scopeId":"2055"}
How do I remove 2017-02-27T13:00:07+01:00 and test.app.event?
You need to use grok to extract the JSON part of the message, then use the json filter to parse the extracted JSON into event fields. Finally, use mutate to remove any fields you don't need in the final event (for example, message).
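To make the three steps concrete, here is a minimal Python sketch (not Logstash itself) that mimics them on a shortened, hypothetical log line: a regex capture stands in for grok, json.loads stands in for the json filter, and deleting keys stands in for mutate's remove_field.

```python
import json
import re

# Shortened sample log line: timestamp, topic, then the JSON payload.
line = '2017-02-27T13:00:07+01:00\ttest\t{"createdAt":"2017-02-27T13:00:07+0100","eventName":"insert"}'

# Step 1 (grok analogue): capture the JSON part of the message.
m = re.match(r'^(?P<timestamp>\S+)\s+(?P<topic>\S+)\s+(?P<json>\{.*\})$', line)
event = m.groupdict()

# Step 2 (json filter analogue): parse the captured JSON into event fields.
event.update(json.loads(event['json']))

# Step 3 (mutate analogue): drop the intermediate fields we no longer need.
for field in ('timestamp', 'topic', 'json'):
    del event[field]

print(event)  # only the parsed JSON fields remain
```

After the three steps, only the keys from the JSON payload survive in the event, which is exactly the effect the Logstash filter chain below produces.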
You can use a regex pattern to capture only the JSON. The pattern should go in your patterns file and looks like this:
REQUIREDDATA {([^}]*)}([^}]*)([^}]*)}}}([^}]*)} <-- this extracts only your json part
The regex pattern has been tested. After that, you can use the pattern in your grok match:
grok {
  patterns_dir => ["/pathto/patterns"]
  match => { "message" => "^%{REQUIREDDATA:new}" }
}
Now the message contains only the JSON part of the log line, so you can push it to ES through logstash. The above is just a sample, so you can adapt it.
Hope it helps!
I used this and it worked for me :) Thanks for the help!
input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => "test"
  }
}
filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}\t%{GREEDYDATA:topic}\t%{GREEDYDATA:json}" }
  }
  json {
    source => "json"
    remove_field => ["timestamp","topic","json","message","@version","@timestamp","tags"]
  }
}
output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    document_type => "app_stats"
    index => "test"
  }
}
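The grok pattern in this config assumes the log line's three fields (timestamp, topic, JSON payload) are tab-separated. A quick Python sketch, using a hypothetical line shaped like the one in the question, shows the same split:

```python
# Hypothetical log line with tab separators, as the grok pattern above assumes:
# %{TIMESTAMP_ISO8601:timestamp}\t%{GREEDYDATA:topic}\t%{GREEDYDATA:json}
line = '2017-02-27T13:00:07+01:00\ttest.app.event\t{"scopeType":"education","scopeId":"2055"}'

# Split into at most three parts so tabs inside the JSON payload (if any) survive.
timestamp, topic, payload = line.split('\t', 2)

print(timestamp)  # 2017-02-27T13:00:07+01:00
print(topic)      # test.app.event
print(payload)    # the raw JSON string, ready for the json filter
```

If your separator is a space rather than a tab, adjust the grok pattern accordingly; with GREEDYDATA the separator must be unambiguous or the capture boundaries will drift.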