How to stop Kafka producer messages (Debezium + Azure Event Hub)



I have set up Debezium and Azure Event Hub as a CDC engine for PostgreSQL, following this tutorial exactly: https://dev.to/azure/tutorial-set-up-a-change-data-capture-architecture-on-azure-using-debezium-postgres-and-kafka-49h6

Everything worked fine until I changed something (I don't know exactly what I changed). Now my Kafka Connect log is flooded with the following warning entries and CDC has stopped working...

[2022-03-03 08:31:28,694] WARN [dbz-ewldb-connector|task-0] [Producer clientId=connector-producer-dbz-ewldb-connector-0] Got error produce response with correlation id 2027 on topic-partition ewldb-0, retrying (2147481625 attempts left). Error: REQUEST_TIMED_OUT (org.apache.kafka.clients.producer.internals.Sender:616)
[2022-03-03 08:31:28,775] WARN [dbz-cmddb-connector|task-0] [Producer clientId=connector-producer-dbz-cmddb-connector-0] Got error produce response with correlation id 1958 on topic-partition cmddb-0, retrying (2147481694 attempts left). Error: REQUEST_TIMED_OUT (org.apache.kafka.clients.producer.internals.Sender:616)
[2022-03-03 08:31:28,800] WARN [dbz-ewldb-connector|task-0] [Producer clientId=connector-producer-dbz-ewldb-connector-0] Got error produce response with correlation id 2028 on topic-partition ewldb-0, retrying (2147481624 attempts left). Error: REQUEST_TIMED_OUT (org.apache.kafka.clients.producer.internals.Sender:616)
[2022-03-03 08:31:28,880] WARN [dbz-cmddb-connector|task-0] [Producer clientId=connector-producer-dbz-cmddb-connector-0] Got error produce response with correlation id 1959 on topic-partition cmddb-0, retrying (2147481693 attempts left). Error: REQUEST_TIMED_OUT (org.apache.kafka.clients.producer.internals.Sender:616)

These messages keep appearing even after I delete the Kafka connector. Restarting Kafka and Kafka Connect does not help. How can I stop these retries?
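One way to confirm that the connector really has been removed while the warnings keep coming is to list the connectors registered with Kafka Connect through its REST API. A minimal check in Python, with localhost:8083 as an assumed host for the Connect REST endpoint:

    import requests

    # List the connectors still registered with Kafka Connect
    # (localhost:8083 is an assumed host for the Connect REST API)
    print(requests.get("http://localhost:8083/connectors").json())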

The only thing that helped resolve the problem was:

  1. Delete the connector via the Debezium / Kafka Connect REST API (see the sketch after this list)
  2. Stop Kafka Connect
  3. Delete the Event Hub
  4. Start Kafka Connect
  5. Re-create the connector via the Debezium / Kafka Connect REST API
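
Steps 1 and 5 go through the Kafka Connect REST API that Debezium connectors are managed with (DELETE and POST on /connectors). Below is a minimal sketch of those two calls in Python; the Connect host localhost:8083, the connector name dbz-ewldb-connector (taken from the log above) and the file connector-config.json holding the connector definition are assumptions, so adjust them to your setup. Deleting and re-creating the Event Hub itself (step 3) is done in the Azure portal or with the Azure CLI and is not shown here.

    import json
    import requests

    CONNECT_URL = "http://localhost:8083"      # Kafka Connect REST endpoint (assumed)
    CONNECTOR_NAME = "dbz-ewldb-connector"     # connector name taken from the log above

    # Step 1: delete the connector registration
    resp = requests.delete(f"{CONNECT_URL}/connectors/{CONNECTOR_NAME}")
    print("delete:", resp.status_code)

    # ... steps 2-4: stop Kafka Connect, delete the Event Hub, start Kafka Connect ...

    # Step 5: re-create the connector from its JSON definition
    # (connector-config.json is a placeholder for the Debezium connector config from
    #  the tutorial, i.e. a JSON object with "name" and "config" keys)
    with open("connector-config.json") as f:
        connector = json.load(f)

    resp = requests.post(
        f"{CONNECT_URL}/connectors",
        headers={"Content-Type": "application/json"},
        json=connector,
    )
    print("create:", resp.status_code, resp.json())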

To permanently change how these reconnect attempts behave, change the following producer parameter:

  • producer.retries=10 (by default it is set to over 2 billion, which is what floods kafkaconnect.log with these warnings; see the sketch below)
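
A sketch of where this setting can go, under two assumptions: either producer.retries=10 is added to the Kafka Connect worker properties file (e.g. connect-distributed.properties) and the worker is restarted, or the retries are overridden per connector with the producer.override. prefix, which requires connector.client.config.override.policy=All in the worker config. The Python call below shows the per-connector variant; the connector name and Connect host are the same assumptions as above.

    import requests

    CONNECT_URL = "http://localhost:8083"      # Kafka Connect REST endpoint (assumed)
    CONNECTOR_NAME = "dbz-ewldb-connector"     # connector name taken from the log above

    # Fetch the current connector config and cap the producer retries.
    # producer.override.* only takes effect if the worker runs with
    # connector.client.config.override.policy=All; otherwise set
    # producer.retries=10 in the worker properties file instead.
    config = requests.get(f"{CONNECT_URL}/connectors/{CONNECTOR_NAME}/config").json()
    config["producer.override.retries"] = "10"

    resp = requests.put(
        f"{CONNECT_URL}/connectors/{CONNECTOR_NAME}/config",
        headers={"Content-Type": "application/json"},
        json=config,
    )
    print(resp.status_code, resp.json())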
