Getting an "Empty configuration for pipeline_id" error when trying to send data from Logstash to Elastic Cloud



Additional information:

  • Logstash version: 6.3.1
  • OS: macOS 10.13.4
  • Elasticsearch: 6.2.24 on Elastic Cloud
  • Kibana: 6.2.24 on Elastic Cloud

Problem:

Hi,

I'm trying to send data from Logstash to Elastic Cloud, but I get the following error when Logstash runs:

"Empty configuration for pipeline_id: artist_profile_views"

However, if I run Logstash without the xpack settings defined in logstash.yml, everything works fine and stdout {} prints the collected data.

Please see my configuration files:

logstash.yml

cloud.id: "[...]"
cloud.auth: "[user]:[password]"
xpack.monitoring.enabled: true
xpack.monitoring.elasticsearch.url: https://elasticCloudUrl
xpack.monitoring.elasticsearch.username: [user]
xpack.monitoring.elasticsearch.password: [password]
xpack.management.enabled: true
xpack.management.pipeline.id: ["artist_profile_views", "searched"]
xpack.management.elasticsearch.username: [user]
xpack.management.elasticsearch.password: [password]
xpack.management.elasticsearch.url: ["https://elasticCloudUrl"]

pipelines.yml

- pipeline.id: artist_profile_views
  path.config: "pipelines/artist_profile_views.conf"
- pipeline.id: searched
  path.config: "pipelines/searched.conf"

artist_profile_views.conf

input {
  file {
    path => "/Users/zabaala/Sites/cna/stats/artist_profile_views/artist_profile_views_*.log"
    codec => json
    start_position => "beginning"
  }
}

filter {
  geoip {
    source => "[ip]"
  }
  useragent {
    source => "[headers][user_agent]"
    target => "[headers][request]"
  }
  mutate {
    remove_field => ["[headers][user_agent]"]
  }
}

output {
  elasticsearch {
    # hosts => ["https://ElasticCloudUrl"]
    index => "stats"
  }
  stdout {
    codec => rubydebug
  }
}
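
For reference, a minimal sketch of how the elasticsearch output could point at the cluster directly when centralized management is not involved (the host, user and password below are placeholders, not real values):

output {
  elasticsearch {
    hosts    => ["https://elasticCloudUrl"]   # placeholder Cloud endpoint
    user     => "[user]"                      # placeholder credentials
    password => "[password]"
    index    => "stats"
  }
}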

Sample data:

{"artist_profile_views":{"id":"510","type":"ARTIST","area":"PHOTOS"},"env":"local","ip":"172.18.0.1","index":"stats","doc":"artist_profile_views","when":{"date":"2018-07-06T17:20:48-0300"},"viewer":{"id":null,"context":"GUEST","by_himself":true},"headers":{"user_agent":"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_13_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/67.0.3396.99 Safari/537.36"}}
{"artist_profile_views":{"id":"510","type":"ARTIST","area":"EVENTS"},"env":"local","ip":"172.18.0.1","index":"stats","doc":"artist_profile_views","when":{"date":"2018-07-06T17:20:50-0300"},"viewer":{"id":null,"context":"GUEST","by_himself":true},"headers":{"user_agent":"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_13_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/67.0.3396.99 Safari/537.36"}}
{"artist_profile_views":{"id":"510","type":"ARTIST","area":"AREA_AUDIOS"},"env":"local","ip":"172.18.0.1","index":"stats","doc":"artist_profile_views","when":{"date":"2018-07-06T17:20:52-0300"},"viewer":{"id":null,"context":"GUEST","by_himself":true},"headers":{"user_agent":"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_13_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/67.0.3396.99 Safari/537.36"}}
{"artist_profile_views":{"id":"510","type":"ARTIST","area":"VIDEOS"},"env":"local","ip":"172.18.0.1","index":"stats","doc":"artist_profile_views","when":{"date":"2018-07-06T17:20:55-0300"},"viewer":{"id":null,"context":"GUEST","by_himself":true},"headers":{"user_agent":"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_13_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/67.0.3396.99 Safari/537.36"}}
{"artist_profile_views":{"id":"510","type":"ARTIST","area":"HOME"},"env":"local","ip":"172.18.0.1","index":"stats","doc":"artist_profile_views","when":{"date":"2018-07-06T17:31:32-0300"},"viewer":{"id":null,"context":"GUEST","by_himself":true},"headers":{"user_agent":"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_13_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/67.0.3396.99 Safari/537.36"}}
{"artist_profile_views":{"id":"510","type":"ARTIST","area":"AREA_AUDIOS"},"env":"local","ip":"172.18.0.1","index":"stats","doc":"artist_profile_views","when":{"date":"2018-07-06T17:31:43-0300"},"viewer":{"id":null,"context":"GUEST","by_himself":true},"headers":{"user_agent":"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_13_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/67.0.3396.99 Safari/537.36"}}

What am I doing wrong?

Thanks.

I found the answer.

The problem happens because the xpack.management.* settings enable the centralized pipeline management feature.

With centralized pipeline management you don't need a pipelines.yml file to configure local Logstash pipelines; instead, you store the pipelines themselves in Elasticsearch. It is an X-Pack feature.

When Logstash is configured to be managed this way, the pipelines listed in xpack.management.pipeline.id are loaded by id from Elasticsearch, so a pipeline with a matching id has to exist there first (for example, created through Kibana's pipeline management UI); otherwise Logstash reports an empty configuration for that id.

For more information about centralized pipeline management, see: https://www.elastic.co/guide/en/logstash/current/logstash-centralized-pipeline-management.html
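
If you would rather keep using the local pipelines.yml instead, the management settings can simply be switched off. A minimal logstash.yml sketch, assuming only monitoring should stay enabled (the Cloud values are placeholders):

cloud.id: "[...]"
cloud.auth: "[user]:[password]"
xpack.monitoring.enabled: true
xpack.monitoring.elasticsearch.url: https://elasticCloudUrl
xpack.monitoring.elasticsearch.username: [user]
xpack.monitoring.elasticsearch.password: [password]
# Centralized pipeline management disabled: pipelines are read from pipelines.yml again.
xpack.management.enabled: false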
