Connecting Rsyslog to Kafka via Kafka Connect



I have two configuration files that I use to set up a stream from rsyslog to a Kafka topic. The configuration is as follows. worker.properties:

bootstrap.servers=localhost:9092
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter.schemas.enable=false
offset.storage.file.filename=/tmp/connect.offsets
offset.flush.interval.ms=10000
plugin.path=/opt/kafka/plugins

syslog.properties:

name=syslog-source
tasks.max=3
connector.class=io.confluent.connect.syslog.SyslogSourceConnector
syslog.port=514
syslog.listener=TCP
syslog.listen.address=0.0.0.0
confluent.topic.bootstrap.servers=localhost:9092
#confluent.topic.replication.factor=1
topics=rsyslog
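
For completeness, I start the Connect worker in standalone mode with both files, roughly like this (the Kafka installation path is my local setup, matching the plugin.path above, and may differ on yours):

/opt/kafka/bin/connect-standalone.sh worker.properties syslog.properties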

But it does not write any logs to Kafka. The output looks like this:

[2020-03-07 07:23:21,698] INFO WorkerSourceTask{id=syslog-source-0} Source task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSourceTask:209)
[2020-03-07 07:23:31,046] INFO WorkerSourceTask{id=syslog-source-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:416)
[2020-03-07 07:23:31,048] INFO WorkerSourceTask{id=syslog-source-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:433)

Can anyone help me figure out why this is happening?

After many attempts, I found it is better to install the rsyslog-kafka package instead, since rsyslog supports Kafka natively.

sudo apt install rsyslog-kafka
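
With that package installed, rsyslog can produce to Kafka directly through its omkafka output module. Below is a minimal sketch of the rsyslog configuration (for example in a file under /etc/rsyslog.d/), assuming the same local broker and topic as in the connector config above; check the exact parameter names against the omkafka documentation for your rsyslog version:

# Load the Kafka output module
module(load="omkafka")

# Forward all messages to the local Kafka broker on topic "rsyslog"
action(
    type="omkafka"
    broker=["localhost:9092"]
    topic="rsyslog"
)

After adding the file, restart rsyslog (e.g. sudo systemctl restart rsyslog) so the new configuration takes effect.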
