Kafka Connect "Unable to connect to the server" - dockerized Kafka Connect worker connecting to Confluent Cloud



My setup below is similar to the one in this article:

https://rmoff.net/2019/11/12/running-dockerised-kafka-connect-worker-on-gcp/

except that I am running the Kafka Connect worker locally rather than on GCP.

Everything starts up fine: I run docker-compose and Kafka Connect comes up, but when I try to create an instance of a source connector via curl, I get the following ambiguous message (note: nothing at all is written to the Kafka Connect log):

{"error_code":400,"message":"Connector configuration is invalid and contains the following 1 error(s):\nUnable to connect to the server.\nYou can also find the above list of errors at the endpoint `/{connectorType}/config/validate`"}

I know I can connect to Confluent Cloud, because I can see the worker's internal topics being created there:

docker-connect-configs  
docker-connect-offsets  
docker-connect-status  
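One way to double-check the cloud connection independently of Connect is to point the `kafka-topics` CLI at the cluster with the same credentials. A sketch, assuming a properties file like the one below (file name and placeholder values are mine, not from the setup above):

```properties
# ccloud.properties (assumed name; substitute your own cluster details)
bootstrap.servers=my-server-name.confluent.cloud:9092
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="<CCLOUD_API_KEY>" password="<CCLOUD_API_SECRET>";
```

Then `kafka-topics --bootstrap-server my-server-name.confluent.cloud:9092 --command-config ccloud.properties --list` should show the three docker-connect-* topics if the broker credentials are good.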

My docker-compose.yml looks like this:

---
version: '2'
services:

  kafka-connect-01:
    image: confluentinc/cp-kafka-connect:5.4.0
    container_name: kafka-connect-01
    restart: always
    depends_on:
      # - zookeeper
      # - kafka
      - schema-registry
    ports:
      - 8083:8083
    environment:
      CONNECT_LOG4J_APPENDER_STDOUT_LAYOUT_CONVERSIONPATTERN: "[%d] %p %X{connector.context}%m (%c:%L)%n"
      CONNECT_BOOTSTRAP_SERVERS: "my-server-name.confluent.cloud:9092"
      CONNECT_REST_PORT: 8083
      CONNECT_REST_ADVERTISED_HOST_NAME: "kafka-connect-01"
      CONNECT_GROUP_ID: compose-connect-group
      CONNECT_CONFIG_STORAGE_TOPIC: docker-connect-configs
      CONNECT_OFFSET_STORAGE_TOPIC: docker-connect-offsets
      CONNECT_STATUS_STORAGE_TOPIC: docker-connect-status
      #CONNECT_KEY_CONVERTER: io.confluent.connect.avro.AvroConverter
      CONNECT_KEY_CONVERTER_SCHEMA_REGISTRY_URL: 'http://my-server-name.confluent.cloud:8081'
      #CONNECT_VALUE_CONVERTER: io.confluent.connect.avro.AvroConverter
      CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_URL: 'http://my-server-name.confluent.cloud:8081'
      CONNECT_INTERNAL_KEY_CONVERTER: "org.apache.kafka.connect.json.JsonConverter"
      CONNECT_INTERNAL_VALUE_CONVERTER: "org.apache.kafka.connect.json.JsonConverter"
      CONNECT_LOG4J_ROOT_LOGLEVEL: "INFO"
      CONNECT_LOG4J_LOGGERS: "org.apache.kafka.connect.runtime.rest=WARN,org.reflections=ERROR"
      CONNECT_REPLICATION_FACTOR: "3"
      CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR: "3"
      CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR: "3"
      CONNECT_STATUS_STORAGE_REPLICATION_FACTOR: "3"
      CONNECT_PLUGIN_PATH: '/usr/share/java'
      CONNECT_KEY_CONVERTER: "org.apache.kafka.connect.json.JsonConverter"
      CONNECT_VALUE_CONVERTER: "org.apache.kafka.connect.json.JsonConverter"
      #ENV VARS FOR CCLOUD CONNECTION
      CONNECT_SSL_ENDPOINT_IDENTIFICATION_ALGORITHM: "https"
      CONNECT_SASL_MECHANISM: PLAIN
      CONNECT_SECURITY_PROTOCOL: SASL_SSL
      CONNECT_SASL_JAAS_CONFIG: "${SASL_JAAS_CONFIG}"
      CONNECT_CONSUMER_SECURITY_PROTOCOL: SASL_SSL
      CONNECT_CONSUMER_SSL_ENDPOINT_IDENTIFICATION_ALGORITHM: https
      CONNECT_CONSUMER_SASL_MECHANISM: PLAIN
      CONNECT_CONSUMER_SASL_JAAS_CONFIG: "${SASL_JAAS_CONFIG}"
      CONNECT_PRODUCER_SECURITY_PROTOCOL: SASL_SSL
      CONNECT_PRODUCER_SSL_ENDPOINT_IDENTIFICATION_ALGORITHM: https
      CONNECT_PRODUCER_SASL_MECHANISM: PLAIN
      CONNECT_PRODUCER_SASL_JAAS_CONFIG: "${SASL_JAAS_CONFIG}"

    volumes:
      - db-leach:/db-leach/
      - $PWD/connectors:/usr/share/java/kafka-connect-jdbc/jars/
    command:
      - /bin/bash
      - -c
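For reference, docker-compose interpolates the `${SASL_JAAS_CONFIG}` placeholder above from the shell environment or from a `.env` file next to the compose file. A sketch with placeholder credentials (the actual API key/secret come from Confluent Cloud):

```properties
# .env (placeholder values - substitute your Confluent Cloud API key and secret)
SASL_JAAS_CONFIG=org.apache.kafka.common.security.plain.PlainLoginModule required username="<CCLOUD_API_KEY>" password="<CCLOUD_API_SECRET>";
```

Keeping the credentials in `.env` also keeps them out of the compose file itself.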

I already have Mongo instances running, and I want to create a Mongo source connector. This is my curl request:

curl -X PUT http://localhost:8083/connectors/my-mongo-source-connector/config -H "Content-Type: application/json" -d '{
"tasks.max":"1",
"connector.class":"com.mongodb.kafka.connect.MongoSourceConnector",
"connection.uri":"mongodb://mongo1:27017,mongo2:27017,mongo3:27017",
"topic.prefix":"topic.prefix",
"topic.suffix":"mySuffix",
"database":"myMongoDB",
"collection":"myMongoCollection",
"copy.existing": "true",
"output.format.key": "json",
"output.format.value": "json",
"change.stream.full.document": "updateLookup",
"publish.full.document.only": "false",
"confluent.topic.bootstrap.servers" : "'${CCLOUD_BROKER_HOST}':9092",
"confluent.topic.sasl.jaas.config" : "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"'${CCLOUD_API_KEY}'\" password=\"'${CCLOUD_API_SECRET}'\";",
"confluent.topic.security.protocol": "SASL_SSL",
"confluent.topic.ssl.endpoint.identification.algorithm": "https",
"confluent.topic.sasl.mechanism": "PLAIN"
}'
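The 400 response above points at the `/config/validate` endpoint, which lets you check a connector config before creating the connector. A minimal sketch of assembling the same payload and where to send it (the helper name and placeholder values here are mine, not part of the connector's API):

```python
import json

def mongo_source_config(uri, database, collection):
    # Assemble the same core fields as the curl payload above.
    # The keys are MongoSourceConnector properties; the values are placeholders.
    return {
        "connector.class": "com.mongodb.kafka.connect.MongoSourceConnector",
        "tasks.max": "1",
        "connection.uri": uri,
        "database": database,
        "collection": collection,
    }

payload = json.dumps(mongo_source_config(
    "mongodb://mongo1:27017,mongo2:27017,mongo3:27017",
    "myMongoDB",
    "myMongoCollection",
))

# The payload can then be PUT to the worker's validate endpoint, e.g.:
#   curl -X PUT -H "Content-Type: application/json" -d "$payload" \
#        http://localhost:8083/connector-plugins/MongoSourceConnector/config/validate
```

The validate response lists per-field errors, which is usually more informative than the terse message the create call returns.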

What am I missing?

I managed to get it working; the configuration above is actually correct...

The message "Unable to connect to the server" appeared because I had deployed my Mongo instances incorrectly, so it had nothing to do with Kafka Connect or Confluent Cloud.

I am leaving this question here as an example in case anyone struggles with this in the future. It took me a while to figure out how to configure docker-compose for a Kafka Connect worker that connects to Confluent Cloud.
