How to fix Kafka Connect JSONConverter "Schema must contain 'type' field"



I am trying to push messages to a JdbcSink. The message looks like this:

{
  "schema": {
    "type": "struct",
    "fields": [{
      "field": "ID",
      "type": {
        "type": "bytes",
        "scale": 0,
        "precision": 64,
        "connect.version": 1,
        "connect.parameters": {
          "scale": "0"
        },
        "connect.name": "org.apache.kafka.connect.data.Decimal",
        "logicalType": "decimal"
      }
    }, {
      "field": "STORE_DATE",
      "type": ["null", {
        "type": "long",
        "connect.version": 1,
        "connect.name": "org.apache.kafka.connect.data.Timestamp",
        "logicalType": "timestamp-millis"
      }],
      "default": null
    }, {
      "field": "DATA",
      "type": ["null", "string"],
      "default": null
    }],
    "name": "KAFKA_STREAM"
  },
  "payload": {
    "ID": 17,
    "STORE_DATE": null,
    "DATA": "THIS IS TEST DATA"
  }
}

But it keeps throwing this error: Caused by: org.apache.kafka.connect.errors.DataException: Schema must contain 'type' field

Here is the connector configuration I am currently using:

{
  "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
  "topics": "DEV_KAFKA_STREAM",
  "connection.url": "url",
  "connection.user": "user",
  "connection.password": "password",
  "insert.mode": "insert",
  "table.name.format": "KAFKA_STREAM",
  "pk.fields": "ID",
  "auto.create": "false",
  "errors.log.enable": "true",
  "errors.log.include.messages": "true",
  "value.converter": "org.apache.kafka.connect.json.JsonConverter",
  "value.converter.schemas.enable": "true"
}

I'm not sure how to debug this or find the root cause, since the JSON clearly does have a type field.

As far as I know, "long" is not a valid schema type.

You want "int64"

JSON schema source code

You may also want to remove the unions. There is an optional key for marking nullable fields.
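Applying both fixes to the message in the question, the envelope would look roughly like this. Note the Connect JSON layout: name, version, and parameters sit directly on the field entry rather than nested under an inner type object, and nullability is expressed with "optional": true instead of a ["null", …] union:

```json
{
  "schema": {
    "type": "struct",
    "name": "KAFKA_STREAM",
    "fields": [{
      "field": "ID",
      "type": "bytes",
      "name": "org.apache.kafka.connect.data.Decimal",
      "version": 1,
      "parameters": { "scale": "0" }
    }, {
      "field": "STORE_DATE",
      "type": "int64",
      "name": "org.apache.kafka.connect.data.Timestamp",
      "version": 1,
      "optional": true
    }, {
      "field": "DATA",
      "type": "string",
      "optional": true
    }]
  },
  "payload": {
    "ID": 17,
    "STORE_DATE": null,
    "DATA": "THIS IS TEST DATA"
  }
}
```

One caveat: for the Decimal logical type, JsonConverter by default expects the payload value to be a base64-encoded byte string, not the plain number 17, so it may be worth asking whether ID needs to be a Decimal at all or could simply be int64.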

Kafka Connect JDBC sink connector not working

If you are building that JSON in Java, you should use the SchemaBuilder and Envelope classes around two JsonNode objects to make sure the payload is built correctly.
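As a sketch of that approach (assuming connect-api and connect-json are on the classpath), you can declare the schema with SchemaBuilder and let JsonConverter produce the schema-plus-payload envelope, rather than hand-writing the JSON:

```java
import java.math.BigDecimal;
import java.util.Map;

import org.apache.kafka.connect.data.Decimal;
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaBuilder;
import org.apache.kafka.connect.data.Struct;
import org.apache.kafka.connect.data.Timestamp;
import org.apache.kafka.connect.json.JsonConverter;

public class EnvelopeExample {
    public static void main(String[] args) {
        // optional() on the builder replaces the ["null", ...] unions.
        Schema schema = SchemaBuilder.struct().name("KAFKA_STREAM")
                .field("ID", Decimal.schema(0))
                .field("STORE_DATE", Timestamp.builder().optional().build())
                .field("DATA", Schema.OPTIONAL_STRING_SCHEMA)
                .build();

        // Struct.put validates each value against the declared schema.
        Struct value = new Struct(schema)
                .put("ID", new BigDecimal(17))
                .put("STORE_DATE", null)
                .put("DATA", "THIS IS TEST DATA");

        // JsonConverter serializes the {"schema": ..., "payload": ...} envelope.
        JsonConverter converter = new JsonConverter();
        converter.configure(Map.of("schemas.enable", "true"), false);
        byte[] message = converter.fromConnectData("DEV_KAFKA_STREAM", schema, value);
        System.out.println(new String(message));
    }
}
```

This way the envelope is guaranteed to match what the sink's JsonConverter expects, since the same converter class generates it.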
