Confluent kafka-python SSL verification



I am using Confluent's kafka-python client ('https://github.com/confluentinc/confluent-kafka-python') to write an application. Both Kafka and the Schema Registry are secured and use HTTPS endpoints.

When running the application, I get the following error:

Result: Failure Exception: SSLError: HTTPSConnectionPool(host='hostname', port=443): 
Max retries exceeded with url: //subjects/schema-value/versions (Caused by SSLError(SSLError("bad handshake: Error([('SSL routines', 'tls_process_server_certificate', 'certificate verify failed')])"))) 

Question 1:

For connecting to the schema registry, where do I specify the certificate values?

Question 2:

For testing, I want to disable SSL verification in Python. What is the option to do that?

Thanks in advance.

Here is the configuration I am using for the Avro producer:

avro_producer_conf = {
    "bootstrap.servers": "SSL://127.0.0.1:9094",
    "security.protocol": "ssl",
    # Certificates used by the producer itself
    "ssl.ca.location": "/ssl/root/intermediate/ca-chain.cert.pem",
    "ssl.certificate.location": "/ssl/root/intermediate/producer/producer.cert.pem",
    "ssl.key.location": "/ssl/root/intermediate/producer/producer.key.pem",
    # Schema Registry endpoint and certificates
    "schema.registry.url": "https://schemaregistry:8081",
    "schema.registry.ssl.ca.location": "/ssl/root/intermediate/ca-chain.cert.pem",
    "schema.registry.ssl.certificate.location": "/ssl/root/intermediate/producer/producer.cert.pem",
    "schema.registry.ssl.key.location": "/ssl/root/intermediate/producer/producer.key.pem"
}

The AvroProducer __init__() method performs parameter separation: everything that should be passed to the Schema Registry client must start with schema.registry.<parameter>. To use SSL with the Schema Registry, make sure the key is unencrypted (a private key without a passphrase). Also make sure the REQUESTS_CA_BUNDLE environment variable is not set, as it confuses the library.
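
For reference, here is a minimal sketch of how such a configuration is typically handed to AvroProducer, assuming the pre-2.0 confluent-kafka-python avro API (confluent_kafka.avro.AvroProducer). The topic name, schema, and record contents are hypothetical, and the certificate paths simply mirror the configuration above:

from confluent_kafka import avro
from confluent_kafka.avro import AvroProducer

# Hypothetical value schema; the field names are illustrative only.
value_schema = avro.loads("""
{
  "type": "record",
  "name": "ExampleValue",
  "namespace": "example.avro",
  "fields": [{"name": "id", "type": "string"}]
}
""")

avro_producer_conf = {
    # librdkafka (broker) TLS settings
    "bootstrap.servers": "SSL://127.0.0.1:9094",
    "security.protocol": "ssl",
    "ssl.ca.location": "/ssl/root/intermediate/ca-chain.cert.pem",
    "ssl.certificate.location": "/ssl/root/intermediate/producer/producer.cert.pem",
    "ssl.key.location": "/ssl/root/intermediate/producer/producer.key.pem",
    # Keys prefixed with schema.registry. are split off by AvroProducer
    # and handed to the Schema Registry client.
    "schema.registry.url": "https://schemaregistry:8081",
    "schema.registry.ssl.ca.location": "/ssl/root/intermediate/ca-chain.cert.pem",
    "schema.registry.ssl.certificate.location": "/ssl/root/intermediate/producer/producer.cert.pem",
    "schema.registry.ssl.key.location": "/ssl/root/intermediate/producer/producer.key.pem",
}

producer = AvroProducer(avro_producer_conf, default_value_schema=value_schema)
producer.produce(topic="example-topic", value={"id": "42"})
producer.flush()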
