Flink: Scala version conflict



I am trying to compile the Kafka example in IntelliJ. After a lot of fiddling with the dependencies, I have hit this error and cannot seem to get past it:

15/10/25 12:36:34 ERROR actor.ActorSystemImpl: Uncaught fatal error from thread [flink-akka.actor.default-dispatcher-4] shutting down ActorSystem [flink]
java.lang.NoClassDefFoundError: scala/runtime/AbstractPartialFunction$mcVL$sp
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.flink.runtime.jobmanager.MemoryArchivist.handleMessage(MemoryArchivist.scala:80)
at org.apache.flink.runtime.FlinkActor$class.receive(FlinkActor.scala:32)
at org.apache.flink.runtime.jobmanager.MemoryArchivist.org$apache$flink$runtime$LogMessages$$super$receive(MemoryArchivist.scala:59)
at org.apache.flink.runtime.LogMessages$class.receive(LogMessages.scala:26)
at org.apache.flink.runtime.jobmanager.MemoryArchivist.receive(MemoryArchivist.scala:59)
at akka.actor.ActorCell.newActor(ActorCell.scala:567)
at akka.actor.ActorCell.create(ActorCell.scala:587)
at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:460)
at akka.actor.ActorCell.systemInvoke(ActorCell.scala:482)
at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:282)
at akka.dispatch.Mailbox.run(Mailbox.scala:223)
at akka.dispatch.Mailbox.exec(Mailbox.scala:234)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

Caused by: java.lang.ClassNotFoundException: scala.runtime.AbstractPartialFunction$mcVL$sp
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 28 more

I have come across hints that this is a Scala version problem. Here is my current list of libraries:

flink-runtime-1.0-SNAPSHOT
flink-streaming-java-1.0-SNAPSHOT
flink-connector-kafka-1.0-SNAPSHOT
flink-java8-1.0-SNAPSHOT
flink-core-1.0-SNAPSHOT
flink-java-1.0-SNAPSHOT
org.apache.hadoop:hadoop-core:1.2.1
flink-clients-1.0-SNAPSHOT
org.apache.kafka:kafka-clients:0.8.2.2
org.apache.kafka:kafka_2.11:0.8.2.2
flink-optimizer-1.0-SNAPSHOT
org.apache.sling:org.apache.sling.commons.json:2.0.6
de.javakaffee:kryo-serializers:0.28
com.github.scopt:scopt_2.11:3.3.0
org.clapper:grizzled-slf4j_2.9.0:0.6.6
com.typesafe.akka:akka-osgi_2.11:2.4.0
com.typesafe.akka:akka-slf4j_2.11:2.4.0

Where am I going wrong?

The problem is indeed a Scala version mismatch. You are mixing dependencies built for Scala 2.11, such as org.apache.kafka:kafka_2.11:0.8.2.2, with Flink dependencies that are built for Scala 2.10 by default.

One of the dependencies built for Scala 2.11 pulls in the scala-library:2.11 jar, which replaces the scala-library:2.10 dependency that the Flink artifacts require. Either use binaries built for Scala 2.10 for the non-Flink dependencies, or build and install Flink against Scala 2.11. See https://ci.apache.org/projects/flink/flink-docs-master/setup/building.html#build-flink-for-a-specific-scala-version for how to build Flink with a different Scala version.
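
For example, a minimal pom.xml sketch of the first option (staying with the default Scala 2.10 Flink build) just switches every Scala-suffixed artifact to its _2.10 build; the versions below are the ones from your list, and running mvn dependency:tree -Dincludes=org.scala-lang shows which artifact drags in which scala-library:

<!-- Sketch only: align all Scala-suffixed artifacts on the Scala version
     Flink was built against (2.10 by default). -->
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.10</artifactId>   <!-- was kafka_2.11 -->
    <version>0.8.2.2</version>
</dependency>
<dependency>
    <groupId>com.github.scopt</groupId>
    <artifactId>scopt_2.10</artifactId>   <!-- was scopt_2.11 -->
    <version>3.3.0</version>
</dependency>
<!-- The akka-*_2.11 and grizzled-slf4j_2.9.0 artifacts need the same
     treatment; Flink pulls in a compatible Akka transitively, so you may
     not need to declare them explicitly at all. -->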

Kafka example

If you just want to run the referenced Kafka example, you have to change the Flink version in the pom.xml file to 0.10-SNAPSHOT, and you have to use FlinkKafkaProducer instead of KafkaSink in the WriteIntoKafka.java file. You no longer need the SimpleStringSchema. That is all you need to change (no additional dependencies are required).
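
As a sketch of the pom.xml side of that change (the flink.version property is how the Flink quickstart poms parametrize the version; whether the example's pom uses the same property is an assumption here):

<properties>
    <!-- Bump the Flink version the example builds against. -->
    <flink.version>0.10-SNAPSHOT</flink.version>
</properties>

The KafkaSink-to-FlinkKafkaProducer swap is then purely a code change inside WriteIntoKafka.java.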
