JSON to AVRO deserialization error in KSQL: Skipping record due to deserialization error



I have set up Confluent Platform on AWS. My source is MySQL, which I have connected to Kafka Connect using the Debezium connector. The data in the source topic is in JSON format. In KSQL I then created a derived topic that converts the JSON topic to AVRO, so that the data can be sunk into MySQL with the JDBC sink connector. I used the following queries:

CREATE STREAM json_stream (userId int, auth_id varchar, email varchar) WITH (KAFKA_TOPIC='test', VALUE_FORMAT='JSON');

Derived topic:

create TABLE avro_stream WITH (VALUE_FORMAT='AVRO') AS select * from json_stream;

I tried sinking the JSON messages into MySQL directly, but that failed because the JDBC sink connector requires a schema, so either JSON messages with an embedded schema or Avro messages are needed to sink the data.
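
For context, this is roughly the envelope shape the JsonConverter expects when schemas.enable=true, hand-written here for the three columns in my stream (not an actual record from my topic):

{
    "schema": {
        "type": "struct",
        "optional": false,
        "fields": [
            {"field": "userId", "type": "int32", "optional": true},
            {"field": "auth_id", "type": "string", "optional": true},
            {"field": "email", "type": "string", "optional": true}
        ]
    },
    "payload": {"userId": 1, "auth_id": "abc", "email": "user@example.com"}
}

Since my source messages are plain JSON without this envelope, I went the Avro route instead.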

When consuming from the topic avro_stream, I get:

 [2019-07-09 13:27:30,239] WARN task [0_3] Skipping record due to deserialization error. topic=[avro_stream] partition=[3] offset=[144] (org.apache.kafka.streams.processor.internals.RecordDeserializer:86)
 org.apache.kafka.connect.errors.DataException: avro_stream
     at io.confluent.connect.avro.AvroConverter.toConnectData(AvroConverter.java:97)
     at io.confluent.ksql.serde.connect.KsqlConnectDeserializer.deserialize(KsqlConnectDeserializer.java:44)
     at io.confluent.ksql.serde.connect.KsqlConnectDeserializer.deserialize(KsqlConnectDeserializer.java:26)
     at org.apache.kafka.common.serialization.ExtendedDeserializer$Wrapper.deserialize(ExtendedDeserializer.java:65)
     at org.apache.kafka.common.serialization.ExtendedDeserializer$Wrapper.deserialize(ExtendedDeserializer.java:55)
     at org.apache.kafka.streams.processor.internals.SourceNode.deserializeValue(SourceNode.java:63)
     at org.apache.kafka.streams.processor.internals.RecordDeserializer.deserialize(RecordDeserializer.java:66)
     at org.apache.kafka.streams.processor.internals.RecordQueue.addRawRecords(RecordQueue.java:97)
     at org.apache.kafka.streams.processor.internals.PartitionGroup.addRawRecords(PartitionGroup.java:117)
     at org.apache.kafka.streams.processor.internals.StreamTask.addRecords(StreamTask.java:638)
     at org.apache.kafka.streams.processor.internals.StreamThread.addRecordsToTasks(StreamThread.java:936)
     at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:831)
     at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:767)
     at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:736)
 Caused by: org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id -1
 Caused by: org.apache.kafka.common.errors.SerializationException: Unknown magic byte!

My Debezium connector configuration:

{
"name": "debezium-connector",
"config": {
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "database.user": "XXXXX",
    "auto.create.topics.enable": "true",
    "database.server.id": "1",
    "tasks.max": "1",
    "database.history.kafka.bootstrap.servers": "X.X.X.X:9092",,
    "database.history.kafka.topic": "XXXXXXX",
    "transforms": "unwrap",
    "database.server.name": "XX-server",
    "database.port": "3306",
    "include.schema.changes": "true",
    "table.whitelist": "XXXX.XXXX",
    "key.converter.schemas.enable": "false",
    "value.converter.schema.registry.url": "http://localhost:8081",
    "database.hostname": "X.X.X.X",
    "database.password": "xxxxxxx",
    "value.converter.schemas.enable": "false",
    "name": "debezium-connector",
    "transforms.unwrap.type": "io.debezium.transforms.UnwrapFromEnvelope",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "database.whitelist": "XXXXX",
    "key.converter": "org.apache.kafka.connect.json.JsonConverter"
},
"tasks": [
    {
        "connector": "debezium-connector",
        "task": 0
    }
],
"type": "source"

}

KSQL writes keys as STRING, so while your values are serialized with Avro, the keys are not. The sink connector's worker therefore needs to be configured as follows:

"key.converter": "org.apache.kafka.connect.storage.StringConverter"
"value.converter": "io.confluent.connect.avro.AvroConverter",
"value.converter.schema.registry.url": "<url to schema registry>",

If the worker itself is already configured to use Avro, you only need to override key.converter in the sink connector's configuration.
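
For example, a JDBC sink connector definition with just the key converter overridden might look roughly like this (a sketch only; the connector name, topic, connection URL and credentials are placeholders, not taken from your setup):

{
    "name": "jdbc-sink-connector",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
        "tasks.max": "1",
        "topics": "avro_stream",
        "connection.url": "jdbc:mysql://X.X.X.X:3306/target_db",
        "connection.user": "XXXXX",
        "connection.password": "xxxxxxx",
        "auto.create": "true",
        "key.converter": "org.apache.kafka.connect.storage.StringConverter"
    }
}

Here value.converter is deliberately not set, so the worker's Avro converter (and its schema.registry.url) is used for the values, while the keys are read as plain strings.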
