object kafka010 is not a member of package org.apache.spark.streaming
I'm trying to consume messages from a secure, SSL-enabled Kafka cluster. I've added the following dependencies for Scala 2.13.6:

name := "realtime-spark-streaming"
version := "0.1"
resolvers += "confluent" at "https://packages.confluent.io/maven/"
resolvers += "Public Maven Repository" at "https://repository.***.com/content/repositories/pangaea_releases"
resolvers += "Nexus Repository" at "https://repository.***.com/content/repositories/pangaea_releases/"
// https://mvnrepository.com/artifact/org.apache.spark/spark-core
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.2.0"
// https://mvnrepository.com/artifact/org.apache.spark/spark-streaming
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "3.2.0" % "provided"
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % "3.2.0" % "2.1.3"
libraryDependencies += "org.slf4j" % "slf4j-simple" % "1.7.12"
// https://mvnrepository.com/artifact/org.apache.kafka/kafka
libraryDependencies += "org.apache.kafka" %% "kafka" % "6.1.0-ccs"
resolvers += Resolver.mavenLocal
scalaVersion := "2.13.6"

My consumer app looks like this:

package main.scala

import org.apache.kafka.clients.consumer.ConsumerRecord
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010._
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

object APP extends App {
  println("Hello, World!")

  // The snippet as posted referenced streamingContext without defining it;
  // a local context with a 5-second batch interval is assumed here.
  val conf = new SparkConf().setAppName("realtime-spark-streaming").setMaster("local[*]")
  val streamingContext = new StreamingContext(conf, Seconds(5))

  val kafkaParams = Map[String, Object](
    "bootstrap.servers" -> "localhost:9092,anotherhost:9092",
    "key.deserializer" -> classOf[StringDeserializer],
    "value.deserializer" -> classOf[StringDeserializer],
    "group.id" -> "use_a_separate_group_id_for_each_stream",
    "auto.offset.reset" -> "latest",
    "enable.auto.commit" -> (false: java.lang.Boolean),
    "security.protocol" -> "SSL",
    "ssl.truststore.location" -> "/some-directory/kafka.client.truststore.jks",
    "ssl.truststore.password" -> "test1234",
    "ssl.keystore.location" -> "/some-directory/kafka.client.keystore.jks",
    "ssl.keystore.password" -> "test1234",
    "ssl.key.password" -> "test1234"
  )

  val topics = Array("topicA", "topicB")
  val stream = KafkaUtils.createDirectStream[String, String](
    streamingContext,
    PreferConsistent,
    Subscribe[String, String](topics, kafkaParams)
  )

  // An output operation is needed for Spark to actually run the job
  stream.map(record => (record.key, record.value)).print()

  streamingContext.start()
  streamingContext.awaitTermination()
}

When I build the package it doesn't report any dependency-mismatch errors, but when I run it the Kafka imports don't resolve… I get the error

object kafka010 is not a member of package org.apache.spark.streaming
import org.apache.spark.streaming.kafka010._

I'm also not sure what the relative path to the truststore and keystore should be if my certificates are in the resources directory.
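(Aside on that last point: Kafka's ssl.truststore.location and ssl.keystore.location settings expect a plain filesystem path, not a classpath resource, so a cert bundled under src/main/resources has to be materialized to disk first. A minimal sketch, with a hypothetical helper name, assumed to live inside the consumer object:)

import java.io.File
import java.nio.file.{Files, StandardCopyOption}

// Hypothetical helper: copies a jar-packed resource out to a temp file
// and returns its absolute path, which Kafka can open directly.
def resourceToTempFile(resourceName: String): String = {
  val in = getClass.getResourceAsStream(s"/$resourceName")
  require(in != null, s"$resourceName not found on the classpath")
  val tmp = File.createTempFile("kafka-ssl-", ".jks")
  tmp.deleteOnExit()
  try Files.copy(in, tmp.toPath, StandardCopyOption.REPLACE_EXISTING)
  finally in.close()
  tmp.getAbsolutePath
}

// e.g. "ssl.truststore.location" -> resourceToTempFile("kafka.client.truststore.jks")

On a real cluster each executor also needs the file locally, so shipping the JKS files with spark-submit --files is the more common approach than bundling them in the jar.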

It's unclear why you have % "2.1.3" there… SBT should be giving you an error saying that can't be downloaded.

Since you're on Scala 2.13.6, you cannot use spark-streaming-kafka-0-10_2.11. You need the _2.13 artifact (which only exists for Spark 3.2.0 and later). Use the %% operator to ensure the suffix always matches:

val sparkVersion = "3.2.0"
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkVersion

Similarly, don't add kafka-clients manually (the Kafka integration pulls it in transitively), and you should add provided to spark-core as well, as sketched below.
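Putting those points together, a corrected build.sbt might look roughly like this. Treat it as a sketch: the versions and Confluent resolver are carried over from the question, the Spark jars are assumed to be supplied by spark-submit at runtime (hence provided), and the Kafka connector stays on the default compile scope because it is not bundled with the Spark distribution:

name := "realtime-spark-streaming"
version := "0.1"
scalaVersion := "2.13.6"

resolvers += "confluent" at "https://packages.confluent.io/maven/"
resolvers += Resolver.mavenLocal

val sparkVersion = "3.2.0"

// provided: supplied by the Spark runtime at spark-submit time
libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
libraryDependencies += "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided"
// not provided: the Kafka connector is not part of the Spark distribution
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkVersion
libraryDependencies += "org.slf4j" % "slf4j-simple" % "1.7.12"

Note that with provided dependencies the app won't start via a plain sbt run; it's meant to be packaged and launched with spark-submit, which puts the Spark jars on the classpath.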
