"The implementation of the FlinkKafkaConsumer010 is not serializable" error



I have created a custom class based on Apache Flink. Here are some parts of the class definition:

public class StreamData {
    private StreamExecutionEnvironment env;
    private DataStream<byte[]> data;
    private Properties properties;

    public StreamData() {
        env = StreamExecutionEnvironment.getExecutionEnvironment();
    }

    public StreamData(StreamExecutionEnvironment e, DataStream<byte[]> d) {
        env = e;
        data = d;
    }

    public StreamData getDataFromESB(String id, int from) {
        final Pattern TOPIC = Pattern.compile(id);
        Properties properties = new Properties();
        properties.setProperty("bootstrap.servers", "localhost:9092");
        properties.setProperty("group.id", Long.toString(System.currentTimeMillis()));
        properties.setProperty("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        properties.setProperty("value.deserializer", "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        properties.put("metadata.max.age.ms", 30000);
        properties.put("enable.auto.commit", "false");
        if (from == 0)
            properties.setProperty("auto.offset.reset", "earliest");
        else
            properties.setProperty("auto.offset.reset", "latest");

        StreamExecutionEnvironment e = StreamExecutionEnvironment.getExecutionEnvironment();
        DataStream<byte[]> stream = env
                .addSource(new FlinkKafkaConsumer011<>(TOPIC, new AbstractDeserializationSchema<byte[]>() {
                    @Override
                    public byte[] deserialize(byte[] bytes) {
                        return bytes;
                    }
                }, properties));
        return new StreamData(e, stream);
    }

    public void print() {
        data.print();
    }

    public void execute() throws Exception {
        env.execute();
    }
}

Using the StreamData class, I try to fetch some data from Apache Kafka and print it in the main function:

StreamData stream = new StreamData();
stream.getDataFromESB("original_data", 0);
stream.print();
stream.execute();

I get this error:

Exception in thread "main" org.apache.flink.api.common.InvalidProgramException: The implementation of the FlinkKafkaConsumer010 is not serializable. The object probably contains or references non serializable fields.
Caused by: java.io.NotSerializableException: StreamData

As mentioned here, I think this is because some data type used in the getDataFromESB function is not serializable, but I don't know how to solve the problem!

Your AbstractDeserializationSchema is an anonymous inner class, so it holds a reference to the enclosing StreamData class, which is not serializable. Either make StreamData serializable, or define the schema as a top-level class.
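For the second option, a minimal sketch of such a top-level schema class could look like the following (the class name RawBytesDeserializationSchema is just an illustrative choice, and the exact import path of AbstractDeserializationSchema depends on your Flink version):

import org.apache.flink.api.common.serialization.AbstractDeserializationSchema;

// Top-level class: carries no reference to StreamData, so it is serializable on its own.
public class RawBytesDeserializationSchema extends AbstractDeserializationSchema<byte[]> {
    @Override
    public byte[] deserialize(byte[] bytes) {
        // Pass the raw Kafka record value through unchanged.
        return bytes;
    }
}

The consumer would then be created with an instance of this class instead of the anonymous one:

DataStream<byte[]> stream = env.addSource(
        new FlinkKafkaConsumer011<>(TOPIC, new RawBytesDeserializationSchema(), properties));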

It looks like you imported FlinkKafkaConsumer010 in your code but are actually using FlinkKafkaConsumer011. Use the following dependency in your sbt file:

"org.apache.flink" %% "flink-connector-kafka-0.11" % flinkVersion
