Schema problem with the Apache Bahir Structured Streaming connector for Apache Spark stream processing

I am trying to connect Apache Spark Structured Streaming to an MQTT topic (in this case the IBM Watson IoT Platform on IBM Bluemix).

I am creating the structured stream as follows:

val df = spark.readStream 
    .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
    .option("username","<username>")
    .option("password","<password>")
    .option("clientId","a:vy0z2s:a-vy0z2s-zfzzckrnqf")
    .option("topic", "iot-2/type/WashingMachine/id/Washer02/evt/voltage/fmt/json")
    .load("tcp://vy0z2s.messaging.internetofthings.ibmcloud.com:1883")

So far so good; in the REPL I get this df object back as follows:

df: org.apache.spark.sql.DataFrame = [value: string, timestamp: timestamp]

However, as soon as I start reading from the stream with the following code:

val query = df.writeStream
    .outputMode("append")
    .format("console")
    .start()

I get the following error:

scala> 17/02/03 07:32:23 ERROR StreamExecution: Query query-1 terminated with error
java.lang.ClassCastException: scala.Tuple2 cannot be cast to scala.runtime.Nothing$
    at org.apache.bahir.sql.streaming.mqtt.MQTTTextStreamSource$$anonfun$getBatch$1$$anonfun$3.apply(MQTTStreamSource.scala:156)
    at org.apache.bahir.sql.streaming.mqtt.MQTTTextStreamSource$$anonfun$getBatch$1$$anonfun$3.apply(MQTTStreamSource.scala:156)
    at scala.collection.MapLike$class.getOrElse(MapLike.scala:128)
    at scala.collection.concurrent.TrieMap.getOrElse(TrieMap.scala:633)
    at org.apache.bahir.sql.streaming.mqtt.MQTTTextStreamSource$$anonfun$getBatch$1.apply$mcZI$sp(MQTTStreamSource.scala:156)
    at org.apache.bahir.sql.streaming.mqtt.MQTTTextStreamSource$$anonfun$getBatch$1.apply(MQTTStreamSource.scala:155)
    at org.apache.bahir.sql.streaming.mqtt.MQTTTextStreamSource$$anonfun$getBatch$1.apply(MQTTStreamSource.scala:155)
    at scala.collection.immutable.Range.foreach(Range.scala:160)
    at org.apache.bahir.sql.streaming.mqtt.MQTTTextStreamSource.getBatch(MQTTStreamSource.scala:155)
    at org.apache.spark.sql.execution.streaming.StreamExecution$$anonfun$5.apply(StreamExecution.scala:332)
    at org.apache.spark.sql.execution.streaming.StreamExecution$$anonfun$5.apply(StreamExecution.scala:329)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
    at scala.collection.Iterator$class.foreach(Iterator.scala:893)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at org.apache.spark.sql.execution.streaming.StreamProgress.foreach(StreamProgress.scala:25)
    at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
    at org.apache.spark.sql.execution.streaming.StreamProgress.flatMap(StreamProgress.scala:25)
    at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runBatch(StreamExecution.scala:329)
    at org.apache.spark.sql.execution.streaming.StreamExecution$$anonfun$org$apache$spark$sql$execution$streaming$StreamExecution$$runBatches$1.apply$mcZ$sp(StreamExecution.scala:194)
    at org.apache.spark.sql.execution.streaming.ProcessingTimeExecutor.execute(TriggerExecutor.scala:43)
    at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runBatches(StreamExecution.scala:184)
    at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:120)
17/02/03 07:32:24 WARN MQTTTextStreamSource: Connection to mqtt server lost.
Connection lost (32109) - java.io.EOFException
    at org.eclipse.paho.client.mqttv3.internal.CommsReceiver.run(CommsReceiver.java:146)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.EOFException
    at java.io.DataInputStream.readByte(DataInputStream.java:267)
    at org.eclipse.paho.client.mqttv3.internal.wire.MqttInputStream.readMqttWireMessage(MqttInputStream.java:65)
    at org.eclipse.paho.client.mqttv3.internal.CommsReceiver.run(CommsReceiver.java:107)
    ... 1 more
17/02/03 07:32:28 WARN MQTTTextStreamSource: Connection to mqtt server lost.

My gut feeling says something is wrong with the schema, so I added one:

import org.apache.spark.sql.types._

val schema = StructType(
    StructField("count",LongType,true)::
    StructField("flowrate",LongType,true)::
    StructField("fluidlevel",StringType,true)::
    StructField("frequency",LongType,true)::
    StructField("hardness",LongType,true)::
    StructField("speed",LongType,true)::
    StructField("temperature",LongType,true)::
    StructField("ts",LongType,true)::
    StructField("voltage",LongType,true):: Nil)
val df = spark.readStream 
    .schema(schema)
    .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
    .option("username","<username>")
    .option("password","<password>")
    .option("clientId","a:vy0z2s:a-vy0z2s-zfzzckrnqf")
    .option("topic", "iot-2/type/WashingMachine/id/Washer02/evt/voltage/fmt/json")
    .load("tcp://vy0z2s.messaging.internetofthings.ibmcloud.com:1883")

But that does not help. Any ideas?
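(Side note: as the REPL output above shows, this source exposes a fixed [value: string, timestamp: timestamp] schema, i.e. the JSON payload arrives as a plain string in the value column. If the goal is to apply the schema above to that payload, a common approach is a from_json projection on value. A minimal sketch, assuming Spark 2.1+ where from_json is available and that the payload fields match the schema defined above:)

import org.apache.spark.sql.functions.{col, from_json}

// Parse the JSON string carried in "value" using the schema defined above,
// then flatten the parsed struct into top-level columns next to the timestamp.
val parsed = df
    .select(from_json(col("value"), schema).as("payload"), col("timestamp"))
    .select("payload.*", "timestamp")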

Your problem seems to be that you are reusing the same client ID for subsequent connections:

Closing TCP connection:   ClientID="a:vy0z2s:a-vy0z2s-xxxxxxxxxx" Protocol=mqtt4-tcp Endpoint="mqtt"   RC=288 Reason="The client ID was reused."  

Each clientID allows only one connection at a time; you cannot have two concurrent connections using the same ID.

Check your client IDs and make sure that multiple instances of the same application use unique client IDs. Applications can share the same API key, but MQTT requires that each client ID is always unique.
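A minimal sketch of that suggestion, assuming the a:<orgId>:<appId> application client-ID convention shown above and that any unique appId suffix is acceptable (the "spark-..." random suffix is purely illustrative, not a value from the original post):

import java.util.UUID

// Give every application instance its own clientId so that no two concurrent
// connections ever reuse the same ID; the API key itself can stay shared.
val clientId = s"a:vy0z2s:spark-${UUID.randomUUID().toString.take(8)}"

val df = spark.readStream
    .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
    .option("username", "<username>")
    .option("password", "<password>")
    .option("clientId", clientId)
    .option("topic", "iot-2/type/WashingMachine/id/Washer02/evt/voltage/fmt/json")
    .load("tcp://vy0z2s.messaging.internetofthings.ibmcloud.com:1883")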
