build.sbt breaks when adding a GraphFrames build for Scala 2.11



I am trying to add GraphFrames to my Scala Spark application. When I added the 2.10-based build everything went fine, but as soon as I try to build it with the GraphFrames build for Scala 2.11, it breaks.

The problem is that conflicting Scala versions (2.10 and 2.11) end up being used. I get the following error:

[error] Modules were resolved with conflicting cross-version suffixes in {file:/E:/Documents/School/LSDE/hadoopcryptoledger/examples/scala-spark-graphx-bitcointransaction/}root:
[error]    org.apache.spark:spark-launcher _2.10, _2.11
[error]    org.json4s:json4s-ast _2.10, _2.11
[error]    org.apache.spark:spark-network-shuffle _2.10, _2.11
[error]    com.twitter:chill _2.10, _2.11
[error]    org.json4s:json4s-jackson _2.10, _2.11
[error]    com.fasterxml.jackson.module:jackson-module-scala _2.10, _2.11
[error]    org.json4s:json4s-core _2.10, _2.11
[error]    org.apache.spark:spark-unsafe _2.10, _2.11
[error]    org.apache.spark:spark-core _2.10, _2.11
[error]    org.apache.spark:spark-network-common _2.10, _2.11
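
(For context: the _2.10 / _2.11 parts are Scala binary-version suffixes. Libraries cross-built for different Scala versions are published as separate artifacts, and sbt refuses to put both flavours of the same module on one classpath. A minimal sketch of how the two dependency operators differ, assuming sbt 0.13, whose default scalaVersion falls in the 2.10 line when none is set:

// %% appends the Scala binary suffix derived from scalaVersion;
// a plain % takes the artifact name exactly as written.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.0"     // resolves to spark-core_2.10 here
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.2.0"  // hard-codes spark-sql_2.11 and its _2.11 transitives

Mixing the two forms in one build is what produces the conflicting-suffix error above.)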

However, I cannot work out what is causing this. Here is my complete build.sbt:

import sbt._
import Keys._
import scala._

lazy val root = (project in file("."))
.settings(
    name := "example-hcl-spark-scala-graphx-bitcointransaction",
    version := "0.1"
)
 .configs( IntegrationTest )
  .settings( Defaults.itSettings : _*)
scalacOptions += "-target:jvm-1.7"
crossScalaVersions := Seq("2.11.8")
resolvers += Resolver.mavenLocal
fork  := true
jacoco.settings
itJacoco.settings

assemblyJarName in assembly := "example-hcl-spark-scala-graphx-bitcointransaction.jar"
libraryDependencies += "com.github.zuinnote" % "hadoopcryptoledger-fileformat" % "1.0.7" % "compile"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.0" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-graphx" % "1.5.0" % "provided"
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.7.0" % "provided"
libraryDependencies += "javax.servlet" % "javax.servlet-api" % "3.0.1" % "it"

libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.7.0" % "it" classifier "" classifier "tests"
libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "2.7.0" % "it" classifier "" classifier "tests"
libraryDependencies += "org.apache.hadoop" % "hadoop-minicluster" % "2.7.0" % "it"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.2.0" % "provided"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test,it"
libraryDependencies += "graphframes" % "graphframes" % "0.5.0-spark2.1-s_2.11"

Can anyone spot which dependency is based on Scala 2.10 and is causing the build to fail?

I found the problem. Apparently, if you use:

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.0" % "provided"

it resolves to the 2.10 artifact by default. Once I changed the spark-core and spark-graphx dependencies to the following, the conflict went away:

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"
libraryDependencies += "org.apache.spark" % "spark-graphx_2.11" % "2.2.0" % "provided"
