Spark Structured Streaming Avro to Avro with a custom sink



Can anyone point me to a good example or sample of writing Avro to S3 or any file system? I am using a custom sink, but I would like to pass a properties Map through the SinkProvider's constructor, which I assume can then be passed on to the Sink?

Updated code:

// Assumes `df` is a streaming DataFrame (e.g. read from Kafka) and that
// spark.implicits._ is in scope to provide the String encoder for mapPartitions.
import org.apache.avro.generic.GenericRecord
import org.apache.spark.sql.streaming.{OutputMode, ProcessingTime}

val query = df.mapPartitions { itr =>
  itr.map { row =>
    val rowInBytes = row.getAs[Array[Byte]]("value")
    MyUtils.deserializeAvro[GenericRecord](rowInBytes).toString
  }
}.writeStream
  .format("com.test.MyStreamingSinkProvider")
  .outputMode(OutputMode.Append())
  .queryName("testQ")
  .trigger(ProcessingTime("10 seconds"))
  .option("checkpointLocation", "my_checkpoint_dir")
  .start()
query.awaitTermination()

Sink provider:

import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.execution.streaming.Sink
import org.apache.spark.sql.sources.StreamSinkProvider
import org.apache.spark.sql.streaming.OutputMode

class MyStreamingSinkProvider extends StreamSinkProvider {
  // `parameters` receives every key/value set via writeStream.option(...)
  override def createSink(sqlContext: SQLContext, parameters: Map[String, String], partitionColumns: Seq[String], outputMode: OutputMode): Sink = {
    new MyStreamingSink
  }
}

Sink:

import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.execution.streaming.Sink
import org.slf4j.{Logger, LoggerFactory}

class MyStreamingSink extends Sink with Serializable {
  final val log: Logger = LoggerFactory.getLogger(classOf[MyStreamingSink])

  override def addBatch(batchId: Long, data: DataFrame): Unit = {
    // Saves the current micro-batch as plain text; note that a fixed path
    // will collide on the second batch, so a per-batch path is needed in practice
    data.rdd.saveAsTextFile("path")
    log.warn(s"Total records processed: ${data.count()}")
    log.warn("Data saved.")
  }
}
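
Since the question itself is about writing Avro rather than text, here is a minimal sketch of what addBatch could look like when persisting each micro-batch as Avro. It assumes the spark-avro package (com.databricks:spark-avro on Spark 2.x) is on the classpath; MyAvroStreamingSink and the basePath constructor argument are hypothetical names for illustration:

import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.execution.streaming.Sink

// Hypothetical sink that writes every micro-batch as Avro files.
class MyAvroStreamingSink(basePath: String) extends Sink with Serializable {
  override def addBatch(batchId: Long, data: DataFrame): Unit = {
    data.write
      .format("com.databricks.spark.avro") // just "avro" on Spark 2.4+ with the built-in module
      .save(s"$basePath/batch-$batchId")   // a per-batch directory avoids overwriting previous output
  }
}

With basePath set to something like s3a://my-bucket/avro-out, this writes straight to S3, provided the Hadoop S3A connector and credentials are configured.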

You should be able to pass parameters to your custom sink via writeStream.option(key, value):

val query = dataset.writeStream
  .format("com.test.MyStreamingSinkProvider")
  .outputMode(OutputMode.Append())
  .queryName("testQ")
  .trigger(ProcessingTime("10 seconds"))
  .option("key_1", "value_1")
  .option("key_2", "value_2")
  .start()

In this case, the parameters map passed to MyStreamingSinkProvider.createSink(...) will contain key_1 and key_2.
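
To wire that up end to end, here is a sketch of a provider that forwards the options map into the sink's constructor so addBatch can use it; the output.path key is made up for illustration:

import org.apache.spark.sql.{DataFrame, SQLContext}
import org.apache.spark.sql.execution.streaming.Sink
import org.apache.spark.sql.sources.StreamSinkProvider
import org.apache.spark.sql.streaming.OutputMode

class MyStreamingSinkProvider extends StreamSinkProvider {
  override def createSink(sqlContext: SQLContext, parameters: Map[String, String], partitionColumns: Seq[String], outputMode: OutputMode): Sink =
    new MyStreamingSink(parameters) // forward the writeStream options to the sink
}

class MyStreamingSink(parameters: Map[String, String]) extends Sink with Serializable {
  // "output.path" is a hypothetical option key, e.g. set via .option("output.path", "s3a://bucket/out")
  private val outputPath = parameters.getOrElse("output.path", "/tmp/out")

  override def addBatch(batchId: Long, data: DataFrame): Unit = {
    data.rdd.saveAsTextFile(s"$outputPath/batch-$batchId")
  }
}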
