SBT: cannot resolve a dependency that previously worked

My build.sbt looks like this:

import sbt._
name := "spark-jobs"
version := "0.1"
scalaVersion := "2.11.8"
resolvers += "Spark Packages Repo" at "https://dl.bintray.com/spark-packages/maven"
// additional libraries
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "2.2.0" % "provided",
  "org.apache.spark" % "spark-streaming_2.11" % "2.2.0",
  "org.apache.spark" % "spark-sql_2.11" % "2.2.0" % "provided",
  "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % "2.2.0"
)
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}

This used to work, until I decided to see what would happen if I added another % "provided" to the end of spark-streaming_2.11. It failed to resolve the dependency, so I went ahead and reverted the change. However, that seems to have left me with an error as well. Now my build.sbt looks exactly like it did when everything worked, yet it gives me this error:

[error] (*:update) sbt.librarymanagement.ResolveException: unresolved dependency: org.apache.spark#spark-streaming_2.11;2.2.0: org.apache.spark#spark-parent_2.11;2.2.0!spark-parent_2.11.pom(pom.original) origin location must be absolute: file:/home/aswin/.m2/repository/org/apache/spark/spark-parent_2.11/2.2.0/spark-parent_2.11-2.2.0.pom

SBT's behavior has me somewhat confused. Could someone walk me through why this is happening? Any good blogs/resources for understanding exactly how SBT works under the hood are welcome.

Here is my project/assembly.sbt:

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.6")

project/build.properties:

sbt.version = 1.0.4

project/plugins.sbt:

resolvers += Resolver.url("artifactory", url("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)
resolvers += "Typesafe Repository" at "http://repo.typesafe.com/typesafe/releases/"

Thanks!

If you are in the SBT console, just run the reload command and then try again. After updating your dependencies or SBT plugins, you need to reload the project for the changes to take effect.
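
For example, a typical sequence in the sbt shell would look like this (the leading > is just the sbt prompt):

> reload
> update

reload re-reads build.sbt and the project/ directory, and update then re-resolves the dependency graph against the reloaded settings.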

By the way, instead of hard-coding the Scala version in each dependency, you can just use the %% operator, and it will pick the appropriate dependency based on the scalaVersion you have defined.

// additional libraries
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.2.0" % "provided",
  "org.apache.spark" %% "spark-streaming" % "2.2.0",
  "org.apache.spark" %% "spark-sql" % "2.2.0" % "provided",
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.2.0"
)
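
As a concrete illustration, %% simply appends the binary Scala version suffix to the artifact name, so with scalaVersion := "2.11.8" the two declarations below resolve to exactly the same artifact:

// equivalent when scalaVersion := "2.11.8"
"org.apache.spark" % "spark-core_2.11" % "2.2.0" % "provided"
"org.apache.spark" %% "spark-core" % "2.2.0" % "provided"

This also means that when you later bump scalaVersion (say, to a 2.12.x release), the %% form picks up the matching _2.12 artifacts automatically instead of silently keeping the _2.11 ones.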
