Unable to establish a connection to a Kafka producer with Kerberos and SASL enabled



I am trying to create a Kafka producer against a cluster with Kerberos and SSL enabled. Here is the application.yml:

```yaml
spring:
  autoconfigure:
    exclude[0]: org.springframework.boot.autoconfigure.security.servlet.SecurityAutoConfiguration
    exclude[1]: org.springframework.boot.actuate.autoconfigure.security.servlet.ManagementWebSecurityAutoConfiguration
  kafka:
    topics:
      - name: SOME_TOPIC
        num-partitions: 5
        replication-factor: 1
    bootstrap-servers:
      - xxx:9092
      - yyy:9092
      - zzz:9092
    autoCreateTopics: false
    template:
      default-topic: SOME_TOPIC
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
      properties:
        security:
          protocol: SASL_SSL
        ssl:
          enabled:
            protocols: TLSv1.2
          truststore:
            location: C:\resources\truststorecred.jks
            password: truststorepass
            type: JKS
        sasl:
          mechanism: GSSAPI
          kerberos:
            service:
              name: kafka
```
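The Spring relaxed keys above resolve to flat Kafka client property names (visible in the config dump in the log below). As a sanity check, a minimal sketch of the equivalent flat properties, using only `java.util.Properties` — the host names, truststore path, and password are the placeholders from the question:

```java
import java.util.Properties;

// Sketch of the flat Kafka client properties the Spring keys above resolve to.
// Host names, paths, and the password are the question's placeholders.
public class ProducerProps {

    static Properties build() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "xxx:9092,yyy:9092,zzz:9092");
        // This must match the protocol of the broker listener being contacted
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "GSSAPI");
        props.put("sasl.kerberos.service.name", "kafka");
        props.put("ssl.enabled.protocols", "TLSv1.2");
        props.put("ssl.truststore.location", "C:\\resources\\truststorecred.jks");
        props.put("ssl.truststore.password", "truststorepass");
        props.put("ssl.truststore.type", "JKS");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.springframework.kafka.support.serializer.JsonSerializer");
        return props;
    }

    public static void main(String[] args) {
        // Quick sanity check of the resolved security settings
        Properties props = build();
        System.out.println(props.getProperty("security.protocol"));      // prints SASL_SSL
        System.out.println(props.getProperty("sasl.kerberos.service.name")); // prints kafka
    }
}
```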

and the VM options are as follows:

```
-Djava.security.auth.login.config=C:\jaas.conf -Djava.security.krb5.conf=C:\resources\krb5.ini
```

The jaas.conf is as follows:

```
KafkaClient {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    storeKey=true
    keyTab="C:\resources\serviceacc@xxx.keytab"
    principal="serviceacc@xxx.COM"
    useTicketCache=true
    serviceName="kafka";
};
```
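As a side note: kafka-clients 0.10.2 and later (including the 1.0.2 client in the log below) also accept the JAAS section inline through the `sasl.jaas.config` client property, which removes the need for the external jaas.conf and the `-Djava.security.auth.login.config` VM option. A sketch using the question's keytab and principal — the value is a single semicolon-terminated line:

```yaml
properties:
  sasl:
    jaas:
      config: >-
        com.sun.security.auth.module.Krb5LoginModule required
        useKeyTab=true storeKey=true
        keyTab="C:\resources\serviceacc@xxx.keytab"
        principal="serviceacc@xxx.COM";
```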

The client is able to log in to Kerberos, but then immediately fails with the following exception:

```
bootstrap.servers = [xxxx.com:9092, yyyy.com:9092, zzzz.com:9092]
client.id = 
connections.max.idle.ms = 300000
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 120000
retries = 5
retry.backoff.ms = 100
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = kafka
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.mechanism = GSSAPI
security.protocol = SASL_SSL
send.buffer.bytes = 131072
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2]
ssl.endpoint.identification.algorithm = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLS
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.location = C:\resources\truststorecred.jks
ssl.truststore.password = [hidden]
ssl.truststore.type = JKS
2019-12-21 14:56:16.115  INFO 24216 --- [           main] o.a.k.c.s.authenticator.AbstractLogin    : Successfully logged in.
2019-12-21 14:56:16.117  INFO 24216 --- [xxx.COM] o.a.k.c.security.kerberos.KerberosLogin  : [Principal=serviceacc@xxx.COM]: TGT refresh thread started.
2019-12-21 14:56:16.118  INFO 24216 --- [xxx.COM] o.a.k.c.security.kerberos.KerberosLogin  : [Principal=serviceacc@xxx.COM]: TGT valid starting at: Sat Dec 21 14:56:15 IST 2019
2019-12-21 14:56:16.119  INFO 24216 --- [xxx.COM] o.a.k.c.security.kerberos.KerberosLogin  : [Principal=serviceacc@xxx.COM]: TGT expires: Sun Dec 22 00:56:15 IST 2019
2019-12-21 14:56:16.119  INFO 24216 --- [xxx.COM] o.a.k.c.security.kerberos.KerberosLogin  : [Principal=serviceacc@xxx.COM]: TGT refresh sleeping until: Sat Dec 21 23:13:36 IST 2019
2019-12-21 14:56:16.912  INFO 24216 --- [           main] o.a.kafka.common.utils.AppInfoParser     : Kafka version : 1.0.2
2019-12-21 14:56:16.912  INFO 24216 --- [           main] o.a.kafka.common.utils.AppInfoParser     : Kafka commitId : 2a121f7b1d402825
2019-12-21 14:56:22.085  WARN 24216 --- [| adminclient-1] o.a.k.common.network.SslTransportLayer   : Failed to send SSL Close message 
java.io.IOException: An existing connection was forcibly closed by the remote host
at sun.nio.ch.SocketDispatcher.write0(Native Method) ~[na:1.8.0_191]
at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:51) ~[na:1.8.0_191]
at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) ~[na:1.8.0_191]
at sun.nio.ch.IOUtil.write(IOUtil.java:65) ~[na:1.8.0_191]
at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) ~[na:1.8.0_191]
at org.apache.kafka.common.network.SslTransportLayer.flush(SslTransportLayer.java:213) ~[kafka-clients-1.0.2.jar:na]
at org.apache.kafka.common.network.SslTransportLayer.close(SslTransportLayer.java:176) ~[kafka-clients-1.0.2.jar:na]
at org.apache.kafka.common.utils.Utils.closeAll(Utils.java:703) [kafka-clients-1.0.2.jar:na]
at org.apache.kafka.common.network.KafkaChannel.close(KafkaChannel.java:61) [kafka-clients-1.0.2.jar:na]
at org.apache.kafka.common.network.Selector.doClose(Selector.java:741) [kafka-clients-1.0.2.jar:na]
at org.apache.kafka.common.network.Selector.close(Selector.java:729) [kafka-clients-1.0.2.jar:na]
at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:522) [kafka-clients-1.0.2.jar:na]
at org.apache.kafka.common.network.Selector.poll(Selector.java:412) [kafka-clients-1.0.2.jar:na]
at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:460) [kafka-clients-1.0.2.jar:na]
at org.apache.kafka.clients.admin.KafkaAdminClient$AdminClientRunnable.run(KafkaAdminClient.java:1006) [kafka-clients-1.0.2.jar:na]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_191]
2019-12-21 14:56:22.087  WARN 24216 --- [| adminclient-1] org.apache.kafka.clients.NetworkClient   : [AdminClient clientId=adminclient-1] Connection to node -2 terminated during authentication. This may indicate that authentication failed due to invalid credentials.
2019-12-21 14:56:26.598  WARN 24216 --- [| adminclient-1] o.a.k.common.network.SslTransportLayer   : Failed to send SSL Close message 
```

Any help would be appreciated. Thanks.

Just one small change worked for me:

```yaml
security.protocol: SASL_PLAINTEXT
```
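For context: the "Connection to node ... terminated during authentication" warning in the log typically means the client's security protocol does not match the broker listener it contacted. If the listener actually runs Kerberos without TLS (SASL_PLAINTEXT), the producer's security block would shrink to something like the sketch below — note that this disables transport encryption, so the `ssl.*` truststore entries are no longer needed:

```yaml
properties:
  security:
    protocol: SASL_PLAINTEXT
  sasl:
    mechanism: GSSAPI
    kerberos:
      service:
        name: kafka
```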
