I am trying to use Flink to read data from Kafka and publish the results to a different Kafka topic, but I run into the following error: `org.apache.flink.api.common.InvalidProgramException: The implementation of the MapFunction is not serializable. The object probably contains or references non serializable fields.`
I receive a message from Kafka, do some processing on it, and return a list of objects that I want to send to another topic.
class Wrapper implements Serializable {

    @JsonProperty("viewBuilderRequests")
    private ArrayList<ViewBuilderRequest> viewBuilderRequests;

    public Wrapper() {}

    public Wrapper(ArrayList<ViewBuilderRequest> viewBuilderRequests) {
        this.viewBuilderRequests = viewBuilderRequests;
    }

    public List<ViewBuilderRequest> getViewBuilderRequests() {
        return viewBuilderRequests;
    }

    public void setViewBuilderRequests(ArrayList<ViewBuilderRequest> viewBuilderRequests) {
        this.viewBuilderRequests = viewBuilderRequests;
    }
}
public class ViewBuilderRequest implements Serializable {

    private CdmId cdmId;
    private ViewBuilderOperation operation;
    private List<ViewUserSystemIdentifier> viewUserSystemIdentifiers;

    public ViewBuilderRequest() {
    }

    public CdmId getCdmId() {
        return cdmId;
    }

    public void setCdmId(CdmId cdmId) {
        this.cdmId = cdmId;
    }

    public ViewBuilderOperation getOperation() {
        return operation;
    }

    public void setOperation(ViewBuilderOperation operation) {
        this.operation = operation;
    }

    public List<ViewUserSystemIdentifier> getViewUserSystemIdentifiers() {
        return viewUserSystemIdentifiers;
    }

    public void setViewUserSystemIdentifiers(List<ViewUserSystemIdentifier> viewUserSystemIdentifiers) {
        this.viewUserSystemIdentifiers = viewUserSystemIdentifiers;
    }

    public enum ViewBuilderOperation implements Serializable {
        Create, Update, Delete
    }
}
private MapFunction<String, Wrapper> parseAndSendToGraphProcessing = s -> {
    UserMatchingRequest userMatchingRequest = objectMapper.readValue(s, UserMatchingRequest.class);
    Wrapper wrapper = new Wrapper(janusGraphDataProcessing.handleMessage(userMatchingRequest));
    return wrapper;
};
The inner classes implement Serializable as well.
The exception is thrown from this code:
dataStream.map(parseAndSendToGraphProcessing)
          .addSink(new FlinkKafkaProducer<Wrapper>(kafkaConfiguration.getBootstrapServers(),
                  "graphNotifications", new WrapperSchema()));
I also have a de/serialization schema for the objects:
public class WrapperSchema implements DeserializationSchema<Wrapper>, SerializationSchema<Wrapper> {

    // private final static ObjectMapper objectMapper = new ObjectMapper().configure(MapperFeature.ACCEPT_CASE_INSENSITIVE_PROPERTIES, true);
    static ObjectMapper objectMapper = new ObjectMapper();

    @Override
    public Wrapper deserialize(byte[] message) throws IOException {
        return objectMapper.readValue(message, Wrapper.class);
    }

    @Override
    public boolean isEndOfStream(Wrapper nextElement) {
        return false;
    }

    @Override
    public byte[] serialize(Wrapper element) {
        // return element.toString().getBytes();
        if (objectMapper == null) {
            // Create the mapper before configuring it (the original code
            // dereferenced the null reference and then overwrote it).
            objectMapper = new ObjectMapper();
            objectMapper.setVisibility(PropertyAccessor.FIELD, JsonAutoDetect.Visibility.ANY);
        }
        try {
            String json = objectMapper.writeValueAsString(element);
            return json.getBytes();
        } catch (JsonProcessingException e) {
            e.printStackTrace();
        }
        return new byte[0];
    }

    @Override
    public TypeInformation<Wrapper> getProducedType() {
        return TypeInformation.of(Wrapper.class);
    }
}
Flink serializes both the messages and the map function itself.
As far as I can tell, your messages look serializable, but your map function is not. It can sometimes be hard to get lambdas to serialize. I think the problem in your case is that parseAndSendToGraphProcessing captures both objectMapper and janusGraphDataProcessing, so both must be serialized along with it.
My guess is that janusGraphDataProcessing is not serializable (objectMapper should be, if you are using Jackson 2.1 or newer).
If that is the case, one workaround is to write a custom RichMapFunction class that stores janusGraphDataProcessing as a transient field and initializes it in its open method.
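A minimal sketch of that workaround, assuming your helper can be constructed inside the task (JanusGraphDataProcessing, UserMatchingRequest and Wrapper are the types from your question; the no-arg constructor call in open is an assumption about your setup):

```java
import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.configuration.Configuration;
import com.fasterxml.jackson.databind.ObjectMapper;

// Sketch: the non-serializable helpers are transient, so they are NOT
// shipped with the function; each parallel task rebuilds them in open().
public class ParseAndSendToGraphProcessing extends RichMapFunction<String, Wrapper> {

    private transient ObjectMapper objectMapper;
    private transient JanusGraphDataProcessing janusGraphDataProcessing;

    @Override
    public void open(Configuration parameters) {
        objectMapper = new ObjectMapper();
        // Assumption: the helper can be created (or looked up) here.
        janusGraphDataProcessing = new JanusGraphDataProcessing();
    }

    @Override
    public Wrapper map(String s) throws Exception {
        UserMatchingRequest request = objectMapper.readValue(s, UserMatchingRequest.class);
        return new Wrapper(janusGraphDataProcessing.handleMessage(request));
    }
}
```

You would then replace the lambda with `dataStream.map(new ParseAndSendToGraphProcessing())`; the function object itself is now trivially serializable because it has no non-transient fields.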