Cassandra Spark Connector version conflicts with Spark 2.2



I am getting the following error when running a Spark job. Please suggest the correct versions of Spark and the Cassandra connector.

Below is my build.sbt:

scalaVersion := "2.11.8"
 
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-streaming" % "2.2.0-cdh6.0.1" % "provided",
  "org.apache.spark" %% "spark-core" % "2.2.0-cdh6.0.1" % "provided", // excludeAll ExclusionRule(organization = "javax.servlet"),
  "org.apache.spark" %% "spark-sql" % "2.2.0-cdh6.0.1" % "provided",
  "org.apache.spark" %% "`enter code here`spark-streaming-kafka-0-10" % "2.2.0-cdh6.0.1",
  "org.apache.hbase" % "hbase-client" % "2.0.0-cdh6.0.1",
  "org.apache.hbase" % "hbase-common" % "2.0.0-cdh6.0.1",
  "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.10",
  "net.liftweb" %% "lift-json" % "3.3.0",
  "com.typesafe" % "config" % "1.2.1"
)

As soon as I submit the job to Spark, I get the following error:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/configuration/ConfigurationException
    at com.datastax.spark.connector.streaming.DStreamFunctions.saveToCassandra$default$4(DStreamFunctions.scala:47)
    at com.StreamingPrerequisiteLoad$.main(StreamingPrerequisiteLoad.scala:72)
    at com.StreamingPrerequisiteLoad.main(StreamingPrerequisiteLoad.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.configuration.ConfigurationException
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)

I ran into a similar problem with the Spark Cassandra connector, and here is what works: for Spark 2.2 and Scala 2.11.8, spark-cassandra-connector 2.3.0 is compatible. Also add the commons-configuration 1.9 jar, because without it the job throws NoClassDefFoundError: org/apache/commons/configuration/ConfigurationException. Try the following dependencies:

    version := "0.1"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.2.0" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0" % "provided"
libraryDependencies += "net.liftweb" %% "lift-json" % "3.0.2"
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.0.0" //% "provided"
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.3.0" //% "provided"
libraryDependencies += "commons-configuration" % "commons-configuration" % "1.9" //% "provided"
assemblyMergeStrategy in assembly := {
  case PathList("org", "apache", "spark", "unused", "UnusedStubClass.class") => MergeStrategy.first
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
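
The NoClassDefFoundError simply means the commons-configuration classes never made it onto the runtime classpath. As a quick sanity check (a minimal sketch; the `ClasspathCheck` object and `classPresent` helper are illustrative names, not part of the original code), you can probe the classpath before the job reaches `saveToCassandra`:

```scala
// Sketch: probe whether a class is visible on the current classpath.
// Helps confirm that commons-configuration was actually packaged into
// the assembly jar submitted to Spark.
object ClasspathCheck {
  // Returns true if `className` can be loaded, false otherwise.
  def classPresent(className: String): Boolean =
    try {
      Class.forName(className)
      true
    } catch {
      case _: ClassNotFoundException => false
    }

  def main(args: Array[String]): Unit = {
    val target = "org.apache.commons.configuration.ConfigurationException"
    println(s"$target present: ${classPresent(target)}")
  }
}
```

If this prints `false` when run via spark-submit, the commons-configuration dependency was not packaged with the job; adding the commons-configuration 1.9 dependency shown above (so it is shaded into the fat jar) resolves it.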
