Kafka messages are not getting inserted into the PostgreSQL database. I can see the messages in the consumer, but nothing is inserted into the table. Any suggestions would be helpful.
Sink_connect.properties
connection.url=jdbc:postgresql://localhost:5432/postgres
user=postgres
password=xxxxxx
insert.mode=insert
table.name.format=kafka_sink_pg
pk.mode=none
pk.fields=none
auto.create=true
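For reference, a standalone JDBC sink connector file also needs the connector name, connector class, task count, and topic list on top of the settings above. Below is a minimal sketch, assuming the topic from the producer command further down and a hypothetical connector name; note that the Confluent JDBC sink documents connection.user/connection.password rather than bare user/password:
# hypothetical connector name
name=jdbc-sink-pg
# sink connector class that must be discoverable under plugin.path
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
tasks.max=1
# topic written by the avro console producer below
topics=Kafka_pg
connection.url=jdbc:postgresql://localhost:5432/postgres
connection.user=postgres
connection.password=xxxxxx
insert.mode=insert
table.name.format=kafka_sink_pg
pk.mode=none
auto.create=true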
Producer
kafka-avro-console-producer --broker-list localhost:9092 --topic Kafka_pg --property value.schema='{"type":"record","name":"kafka_sink_pg","fields":[{"name":"serial_no","type":"int"},{"name":"technology", "type": "string"}, {"name":"platform", "type": "string"}]}'
Messages
{"serial_no": 1, "technology": "ETL", "platform": "Informatica"}
{"serial_no": 2, "technology": "ETL", "platform": "Talend"}
Below is the error message from the log file:
[2020-08-12 03:50:09,940] INFO Kafka Connect started (org.apache.kafka.connect.runtime.Connect:57)
[2020-08-12 03:50:09,943] ERROR Failed to create job for ../config/sink-quickstart-Postgres.properties (org.apache.kafka.connect.cli.ConnectStandalone:110)
[2020-08-12 03:50:09,952] ERROR Stopping after connector error (org.apache.kafka.connect.cli.ConnectStandalone:121)
java.util.concurrent.ExecutionException: org.apache.kafka.connect.errors.ConnectException: Failed to find any class that implements Connector and which name matches io.confluent.connect.jdbc.JdbcSinkConnector
The error occurred because plugin.path did not point to the location of the JDBC connector, so the JdbcSinkConnector class could not be found. The problem was resolved by providing the complete plugin path in the connect-avro-standalone.properties file.
Original
plugin.path=share/java
Changed to
# provided the complete path to the connector plugins (an inline comment after the value would be read as part of it in a .properties file)
plugin.path=/usr/kafka/share/java
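After updating plugin.path, the standalone worker has to be restarted so it rescans the plugin directory, and the table can then be checked in Postgres. A sketch, assuming the Confluent bin directory is on the PATH and using the file names mentioned above:
# restart the standalone worker so the new plugin.path is picked up
connect-standalone connect-avro-standalone.properties ../config/sink-quickstart-Postgres.properties
# verify that the sink created and filled the table
psql -U postgres -d postgres -c "SELECT * FROM kafka_sink_pg;"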