Flink nested class to DataStream conversion error



I am using Flink 1.13. I am trying to convert a table result back into a DataStream as shown below, but I keep getting an error.

import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Schema;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

import java.time.Duration;
import java.time.LocalDateTime;

import static org.apache.flink.table.api.Expressions.$;

public class HybridTrial {

    public static class Address {
        public String street;
        public String houseNumber;

        public Address() {}

        public Address(String street, String houseNumber) {
            this.street = street;
            this.houseNumber = houseNumber;
        }
    }

    public static class User {
        public String name;
        public Integer score;
        public LocalDateTime event_time;
        public Address address;

        // default constructor for DataStream API
        public User() {}

        // fully assigning constructor for Table API
        public User(String name, Integer score, LocalDateTime event_time, Address address) {
            this.name = name;
            this.score = score;
            this.event_time = event_time;
            this.address = address;
        }
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        DataStream<User> dataStream =
            env.fromElements(
                    new User("Alice", 4, LocalDateTime.now(), new Address()),
                    new User("Bob", 6, LocalDateTime.now(), new Address("NBC", "204")),
                    new User("Alice", 10, LocalDateTime.now(), new Address("ABC", "1033")))
                .assignTimestampsAndWatermarks(
                    WatermarkStrategy.<User>forBoundedOutOfOrderness(Duration.ofSeconds(60)));

        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);

        Table table = tableEnv.fromDataStream(dataStream, Schema.newBuilder().build());
        table.printSchema();

        Table t = table.select($("*"));

        DataStream<User> dsRow = tableEnv.toDataStream(t, User.class);
        dsRow.print();

        env.execute();
    }
}

The error I get is:

Exception in thread "main" org.apache.flink.table.api.ValidationException: Column types of query result and sink for registered table 'default_catalog.default_database.Unregistered_DataStream_Sink_1' do not match.
Cause: Incompatible types for sink column 'event_time' at position 2.
Query schema: [name: STRING, score: INT, event_time: RAW('java.time.LocalDateTime', '...'), address: *flinkSqlExperiments.HybridTrial$Address<`street` STRING, `houseNumber` STRING>*]
Sink schema:  [name: STRING, score: INT, event_time: TIMESTAMP(9), address: *flinkSqlExperiments.HybridTrial$Address<`street` STRING, `houseNumber` STRING>*]
at org.apache.flink.table.planner.connectors.DynamicSinkUtils.createSchemaMismatchException(DynamicSinkUtils.java:437)
at org.apache.flink.table.planner.connectors.DynamicSinkUtils.validateSchemaAndApplyImplicitCast(DynamicSinkUtils.java:256)
at org.apache.flink.table.planner.connectors.DynamicSinkUtils.convertSinkToRel(DynamicSinkUtils.java:198)
at org.apache.flink.table.planner.connectors.DynamicSinkUtils.convertExternalToRel(DynamicSinkUtils.java:143)

I also tried a custom conversion from the DataStream to the Table, but I still hit errors when converting the Table back to a DataStream. I am stuck, so any help is appreciated.

The reflection-based automatic type extraction in the DataStream API is not as powerful as that of the Table API. This is also due to state backward-compatibility concerns in the DataStream API.

The event_time field is a GenericType in the DataStream API, which results in RAW in the Table API. You have the following options (sketched below):

  • provide proper TypeInformation in fromElements
  • override the TypeInformation with a DataType in fromDataStream
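
A minimal sketch of both options (not the exact code from this answer), assuming the User and Address POJOs plus the env and tableEnv variables from the question; the field map passed to Types.POJO and the TIMESTAMP(9) precision are my own choices:

import java.util.HashMap;
import java.util.Map;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.table.api.DataTypes;

// Option 1: give explicit TypeInformation so that event_time is extracted as
// LOCAL_DATE_TIME instead of a GenericType (the field map is an assumption).
Map<String, TypeInformation<?>> fields = new HashMap<>();
fields.put("name", Types.STRING);
fields.put("score", Types.INT);
fields.put("event_time", Types.LOCAL_DATE_TIME);
fields.put("address", Types.POJO(Address.class));

TypeInformation<User> userTypeInfo = Types.POJO(User.class, fields);

DataStream<User> typedStream =
    env.fromElements(
            new User("Alice", 4, LocalDateTime.now(), new Address()),
            new User("Bob", 6, LocalDateTime.now(), new Address("NBC", "204")))
        .returns(userTypeInfo);

// Option 2: keep the stream as it is and declare the physical columns with explicit
// DataTypes when converting to a table; TIMESTAMP(9) mirrors the precision the
// Table API derives for a LocalDateTime field in the sink schema.
Table table =
    tableEnv.fromDataStream(
        dataStream,
        Schema.newBuilder()
            .column("name", DataTypes.STRING())
            .column("score", DataTypes.INT())
            .column("event_time", DataTypes.TIMESTAMP(9))
            .column("address", DataTypes.of(Address.class))
            .build());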

Registering the POJO as follows solved the problem for me:

env.getConfig().registerPojoType(YourClass.class);

You can register any user-defined DTO as a POJO this way.
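
A short usage sketch, assuming the env, User, and Address names from the question; registering the nested Address class as well is my own addition, not part of the original answer:

// Register the POJOs with the serialization stack before building any streams.
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
env.getConfig().registerPojoType(User.class);
env.getConfig().registerPojoType(Address.class);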
