How to periodically update an RDD in Spark Streaming



My code looks like this:

from pyspark import SparkContext
from pyspark.streaming import StreamingContext

sc = SparkContext()
ssc = StreamingContext(sc, 30)
initRDD = sc.parallelize('path_to_data')
lines = ssc.socketTextStream('localhost', 9999)
res = lines.transform(lambda x: x.join(initRDD))
res.pprint()

My problem is that initRDD needs to be refreshed every day at midnight.

I tried doing it like this:

sc = SparkContext()
ssc = StreamingContext(sc, 30)
lines = ssc.socketTextStream('localhost', 9999)

def func(rdd):
    initRDD = rdd.context.parallelize('path_to_data')
    return rdd.join(initRDD)

res = lines.transform(func)
res.pprint()

But this way initRDD seems to be rebuilt every 30 seconds, which is just the batchDuration, since func is re-executed on every batch.

Any good ideas?

One option is to check a deadline before the transform. The check is a simple comparison, so it is cheap to run at each batch interval:

import java.time.{LocalDate, ZoneOffset}

def nextDeadline(): Long = {
  // assumes midnight on UTC timezone.
  LocalDate.now.atStartOfDay().plusDays(1).toInstant(ZoneOffset.UTC).toEpochMilli()
}

// Note this is a mutable variable!
var initRDD = sparkSession.read.parquet("/tmp/learningsparkstreaming/sensor-records.parquet")
// Note this is a mutable variable!
var _nextDeadline = nextDeadline()

val lines = ssc.socketTextStream("localhost", 9999)

// We use foreachRDD as a scheduling trigger.
// We don't use the data, only the execution hook.
lines.foreachRDD { _ =>
  if (System.currentTimeMillis > _nextDeadline) {
    initRDD = sparkSession.read.parquet("/tmp/learningsparkstreaming/sensor-records.parquet")
    _nextDeadline = nextDeadline()
  }
}

// If the RDD was updated, it will be picked up in this stage.
val res = lines.transform(rdd => rdd.join(initRDD))
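
Since the question uses the Python API, here is a minimal PySpark sketch of the same deadline-check idea. The path, port, and the (key, value) shaping for the join are placeholders carried over from the question or my own assumptions, not a tested implementation:

from datetime import datetime, timedelta, timezone

from pyspark import SparkContext
from pyspark.streaming import StreamingContext

sc = SparkContext()
ssc = StreamingContext(sc, 30)

def next_deadline():
    # Next midnight in UTC, mirroring the Scala nextDeadline().
    now = datetime.now(timezone.utc)
    return (now + timedelta(days=1)).replace(hour=0, minute=0,
                                             second=0, microsecond=0)

def load_init_rdd():
    # Hypothetical loader: join() needs (key, value) pairs, so each
    # line is keyed on itself here; 'path_to_data' is the question's
    # placeholder.
    return sc.textFile('path_to_data').map(lambda line: (line, line))

# Deliberately mutable, as in the Scala answer.
init_rdd = load_init_rdd()
deadline = next_deadline()

lines = ssc.socketTextStream('localhost', 9999)

def maybe_refresh(rdd):
    # foreachRDD runs this on the driver once per batch; it is used
    # only as a scheduling hook, the batch data itself is ignored.
    global init_rdd, deadline
    if datetime.now(timezone.utc) > deadline:
        init_rdd = load_init_rdd()
        deadline = next_deadline()

lines.foreachRDD(maybe_refresh)

# transform re-runs this lambda every batch, so it sees the rebound
# init_rdd after a refresh; the stream is keyed to make join() valid.
res = lines.transform(lambda rdd: rdd.map(lambda l: (l, 1)).join(init_rdd))
res.pprint()

ssc.start()
ssc.awaitTermination()

This works for the same reason as the Scala version: the transform function is re-evaluated on every batch and looks up init_rdd at call time, so a rebinding done in the foreachRDD hook is picked up on the next batch.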
