I have a question: is it possible to get a custom object back out of a spark.sql.Row? My current code can push the data into a Row, but I can't extract it again.
First, there is a simple POJO:
import java.io.Serializable;
import java.util.Map;

public class Event implements Serializable {
    private Map<String, Object> fields;

    public Event() {
    }

    public Event(Map<String, Object> fields) {
        this.fields = fields;
    }

    public Map<String, Object> getFields() {
        return fields;
    }

    public void setFields(Map<String, Object> fields) {
        this.fields = fields;
    }
}
As the next step, we create a JavaDStream<Tuple2<String, Event>> with the Spark Streaming API. After that, we convert each RDD into a Dataset:
JavaDStream<Tuple2<String, Event>> events = ...

events.foreachRDD(tuple2JavaRDD -> {
    SparkSession sparkSession = SparkSession.builder()
            .config(tuple2JavaRDD.context().conf())
            .getOrCreate();

    Dataset<Row> dataSet = sparkSession.createDataset(tuple2JavaRDD.rdd(),
            Encoders.tuple(Encoders.STRING(), Encoders.bean(Event.class)))
            .toDF("EventType", "Event");

    // Try to get the data back out
    Dataset<Event> eventsSet = dataSet.map(
            (MapFunction<Row, Event>) row -> row.<Event>getAs(1),
            Encoders.bean(Event.class));

    // This throws an exception when an element is pulled from the stream
    eventsSet.show();
});
This is the error I get:
java.lang.ClassCastException: org.apache.spark.sql.catalyst.expressions.GenericRowWithSchema cannot be cast to Event
What about:
eventsSet.select("Event").as(Encoders.bean(Event.class));
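To see how that would slot into the snippet from the question, here is a minimal sketch (dataSet is the variable built inside foreachRDD above; the select("Event.*") variant in the comment is an assumption, since whether the encoder resolves the bean's fields property against a single struct column can depend on the Spark version):

// Decode the nested "Event" struct with the bean encoder
// instead of casting the Row by position.
Dataset<Event> eventsSet = dataSet
        .select("Event")
        .as(Encoders.bean(Event.class));

// If the encoder cannot resolve the bean property "fields" against the
// single struct column, flattening it first is a common variant:
// dataSet.select("Event.*").as(Encoders.bean(Event.class));

eventsSet.show();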
The bean encoder stores the Event as a struct inside the Row, so row.getAs(1) returns a GenericRowWithSchema rather than an Event. With your code, you should construct the object step by step:
...
Event event = new Event();
event.setFields(row.getAs(...));
return event;
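Filled out, a minimal sketch of that mapper, assuming the column layout from toDF("EventType", "Event") above (so the Event struct sits at position 1 and carries a "fields" entry matching the bean property; both names are taken from the question's snippet):

Dataset<Event> eventsSet = dataSet.map((MapFunction<Row, Event>) row -> {
    // Position 1 holds the encoded Event as a struct (a Row), not an
    // Event instance, so unpack it field by field instead of casting.
    Row eventRow = row.getStruct(1);
    Event event = new Event();
    event.setFields(eventRow.<String, Object>getJavaMap(
            eventRow.fieldIndex("fields")));
    return event;
}, Encoders.bean(Event.class));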