Checkpoint SQLContext NullPointerException issue



I am using checkpointing in my application, and when the application starts up after a failure I get a NullPointerException on the SQLContext.
I think the application is unable to recover the SQLContext because of a serialization/deserialization problem. Isn't SQLContext serializable?

Here is my code:

    //DriverClass
    final JavaSparkContext javaSparkCtx = new JavaSparkContext(conf);
    final SQLContext sqlContext = new SQLContext(javaSparkCtx);
    JavaStreamingContextFactory javaStreamingContextFactory = new JavaStreamingContextFactory() {
        @Override
        public JavaStreamingContext create() { // only executed the first time
            JavaStreamingContext jssc = new JavaStreamingContext(javaSparkCtx, Durations.minutes(1));
            jssc.checkpoint(CHECKPOINT_DIRECTORY);
            HashMap<String, String> kafkaParams = new HashMap<String, String>();
            kafkaParams.put("metadata.broker.list",
                            "abc.xyz.localdomain:6667");
            //....
            JavaDStream<String> fullMsg = messages
                                             .map(new MapFunction());
            fullMsg.foreachRDD(new SomeClass(sqlContext));
            return jssc;
        }
    };
    }
    //Closure Class
    public class SomeClass implements Serializable, Function<JavaRDD<String>, Void> {
        SQLContext sqlContext;
        public SomeClass(SQLContext sqlContext) {
            this.sqlContext = sqlContext;
        }
        public void doSomething() {
            this.sqlContext.createDataFrame(); // NullPointerException is thrown here
        }
        //.......
    }

SQLContext is serializable because Spark SQL needs to use a SQLContext internally on the executor side. However, you should not serialize it into the Streaming checkpoint. Instead, you should obtain it from the RDD, like this: `SQLContext sqlContext = SQLContext.getOrCreate(rdd.context());`
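Applied to the code in the question, that means the closure class should stop holding the driver-side `sqlContext` field and instead fetch the context inside `call()`. A minimal sketch against the Spark 1.6 Java API (the class and field names mirror the question; the `createDataFrame` call is left as a placeholder, since the question elides its arguments):

```java
import java.io.Serializable;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.sql.SQLContext;

// Closure class that survives checkpoint recovery: nothing SQLContext-related
// is serialized into the checkpoint, so there is no null field after restart.
public class SomeClass implements Serializable, Function<JavaRDD<String>, Void> {
    @Override
    public Void call(JavaRDD<String> rdd) {
        // getOrCreate returns the existing singleton SQLContext, or creates a
        // fresh one from the recovered SparkContext after a failure restart.
        SQLContext sqlContext = SQLContext.getOrCreate(rdd.context());
        // ... use sqlContext here, e.g. sqlContext.createDataFrame(...)
        return null;
    }
}
```

Because the SQLContext is re-derived from the RDD's SparkContext on every batch, the same code works both on the first run and after recovery from the checkpoint directory.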

See the Streaming documentation for more details: http://spark.apache.org/docs/1.6.1/streaming-programming-guide.html#dataframe-and-sql-operations
