Runtime error: ClassNotFound exception with Spark



I want to run the out-of-the-box Spark Streaming (version 1.6) example on my own, so I can build on top of it. I am able to compile and run the example as-is, bundled with the other code examples, i.e.:

 ./bin/run-example streaming.StatefulNetworkWordCount localhost 9999

But I am unable to do the same (with the same code) in my own project. Any help?

build.sbt:

import sbtassembly.AssemblyKeys
name := "stream-test"
version := "1.0"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka" % "1.6.0"
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.6.0"
libraryDependencies += "org.json4s" %% "json4s-native" % "3.2.10"
libraryDependencies += "org.scalatest" %% "scalatest" % "2.2.4" % "test"
assemblyJarName in assembly := "stream_test_" + version.value + ".jar"
assemblyMergeStrategy in assembly := {
  case PathList("org", "apache", xs @ _*) => MergeStrategy.last
  case PathList("com", "google", xs @ _*) => MergeStrategy.last
  case PathList("com", "esotericsoftware", xs @ _*) => MergeStrategy.last
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)
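One thing worth double-checking in this build: the application jar is produced for Scala 2.10 (`stream-test_2.10-1.0.jar`), while the `spark-submit` command below passes a `_2.11` Spark artifact via `--jars`. Pinning the Scala version explicitly, and marking `spark-streaming` as `provided` (it ships with the Spark 1.6 runtime), avoids mixing binary-incompatible artifacts. A sketch of the additions (versions assumed from the question):

```scala
// Additions to build.sbt (a sketch, not the asker's actual file)
scalaVersion := "2.10.5"  // match the _2.10 artifacts actually being produced

// spark-streaming is part of the Spark distribution that launches the job,
// so it should not be bundled into the assembly jar
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.6.0" % "provided"
```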

This compiles fine. However, when I run it I get an error (note that I am using Spark 1.6 to run it):

    $ ../../../app/spark-1.6.0-bin-hadoop2.6/bin/spark-submit --jars /app/spark-streaming_2.11-1.6.0.jar   --master local[4] --class "StatefulNetworkWordCount"    ./target/scala-2.10/stream-test_2.10-1.0.jar localhost 9999
    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
    16/02/10 22:16:30 INFO SparkContext: Running Spark version 1.4.1
    2016-02-10 22:16:32.451 java[86932:5664316] Unable to load realm info from SCDynamicStore
    16/02/10 22:16:33 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
   ..
    16/02/10 22:16:39 INFO Utils: Successfully started service 'sparkDriver' on port 60720.
    16/02/10 22:16:40 INFO SparkEnv: Registering MapOutputTracker
    16/02/10 22:16:40 INFO SparkEnv: Registering BlockManagerMaster
    16/02/10 22:16:42 INFO SparkUI: Started SparkUI at http://xxx:4040
    16/02/10 22:16:43 INFO SparkContext: Added JAR file://app/spark-streaming_2.11-1.6.0.jar at http://xxx:60721/jars/spark-streaming_2.11-1.6.0.jar with timestamp 1455171403485
    16/02/10 22:16:43 INFO SparkContext: Added JAR file:/projects/spark/test/./target/scala-2.10/stream-test_2.10-1.0.jar at http://xxx:60721/jars/stream-test_2.10-1.0.jar with timestamp 1455171403562
    16/02/10 22:16:44 INFO Executor: Starting executor ID driver on host localhost
    16/02/10 22:16:44 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 60722.
  ..
    16/02/10 22:16:44 INFO BlockManagerMaster: Registered BlockManager
    Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.streaming.dstream.PairDStreamFunctions.mapWithState(Lorg/apache/spark/streaming/StateSpec;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;)Lorg/apache/spark/streaming/dstream/MapWithStateDStream;
        at StatefulNetworkWordCount$.main(StatefulNetworkWordCount.scala:50)
        at StatefulNetworkWordCount.main(StatefulNetworkWordCount.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

That method is in the classes inside the jar, so I don't understand..
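For reference, the call that fails is the `mapWithState` transformation, an API that was only added in Spark 1.6; it does not exist in 1.4.x, which is exactly why a `NoSuchMethodError` appears when an older runtime is picked up. A minimal sketch of the stateful word count, condensed from the bundled `StatefulNetworkWordCount` example (object and variable names here are illustrative):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, State, StateSpec, StreamingContext}

object StatefulWordCountSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("StatefulNetworkWordCount")
    val ssc = new StreamingContext(conf, Seconds(1))
    ssc.checkpoint(".")  // mapWithState requires a checkpoint directory

    val lines = ssc.socketTextStream(args(0), args(1).toInt)
    val wordDstream = lines.flatMap(_.split(" ")).map(word => (word, 1))

    // Running count per word, carried across batches via mapWithState (Spark 1.6+ only)
    val mappingFunc = (word: String, one: Option[Int], state: State[Int]) => {
      val sum = one.getOrElse(0) + state.getOption.getOrElse(0)
      state.update(sum)
      (word, sum)
    }
    val stateDstream = wordDstream.mapWithState(StateSpec.function(mappingFunc))

    stateDstream.print()
    ssc.start()
    ssc.awaitTermination()
  }
}
```

Since the code compiles against 1.6 but the log above says `Running Spark version 1.4.1`, the method reference cannot be resolved at runtime.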

Found the answer: even though I was launching spark-submit from 1.6, my SPARK_HOME was still pointing to the previous Spark 1.4 installation (the `Running Spark version 1.4.1` log line above was the giveaway). Setting SPARK_HOME to the same 1.6 installation as the spark-submit used to run the code solved the problem.
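In other words, SPARK_HOME must agree with the spark-submit binary actually being invoked. Deriving one from the other avoids the mismatch entirely (the installation path below is taken from the question and may differ on your machine):

```shell
# Path to the spark-submit actually used to launch the job (from the question)
SPARK_SUBMIT=/app/spark-1.6.0-bin-hadoop2.6/bin/spark-submit

# SPARK_HOME is the installation root, i.e. two levels up from bin/spark-submit
export SPARK_HOME=$(dirname "$(dirname "$SPARK_SUBMIT")")
echo "$SPARK_HOME"   # → /app/spark-1.6.0-bin-hadoop2.6
```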
