intellij: errors with scala, spark



I have a Scala project that uses the Spark library, and most of the time it works fine (in IntelliJ). But sometimes, when IntelliJ starts up, it fails with errors like these:

[warn]  Note: Unresolved dependencies path:
[error] stack trace is suppressed; run 'last update' for the full output
[error] stack trace is suppressed; run 'last ssExtractDependencies' for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.spark:spark-core_2.12:3.0.0-preview2
[error]   Not found
[error]   Not found
[error]   not found: C:....ivy2localorg.apache.sparkspark-core_2.123.0.0-preview2ivysivy.xml
[error]   checksum format error: C:....AppDataLocalCoursierCachev1httpsrepo1.maven.orgmaven2orgapachesparkspark-core_2.123.0.0-preview2.spark-core_2.12-3.0.0-preview2.pom__sha1
[error]   checksum format error: C:....AppDataLocalCoursierCachev1httpsrepo1.maven.orgmaven2orgapachesparkspark-core_2.123.0.0-preview2.spark-core_2.12-3.0.0-preview2.pom__sha1
[error] Error downloading org.apache.spark:spark-sql_2.12:3.0.0-preview2
[error]   Not found
[error]   Not found
[error]   not found: C:....ivy2localorg.apache.sparkspark-sql_2.123.0.0-preview2ivysivy.xml
[error]   checksum format error: C:....AppDataLocalCoursierCachev1httpsrepo1.maven.orgmaven2orgapachesparkspark-sql_2.123.0.0-preview2.spark-sql_2.12-3.0.0-preview2.pom__sha1
[error] (ssExtractDependencies) sbt.librarymanagement.ResolveException: Error downloading org.apache.spark:spark-core_2.12:3.0.0-preview2
[error]   Not found
[error]   Not found
[error]   not found: C:....ivy2localorg.apache.sparkspark-core_2.123.0.0-preview2ivysivy.xml
[error]   checksum format error: C:....AppDataLocalCoursierCachev1httpsrepo1.maven.orgmaven2orgapachesparkspark-core_2.123.0.0-preview2.spark-core_2.12-3.0.0-preview2.pom__sha1
[error] Error downloading org.apache.spark:spark-sql_2.12:3.0.0-preview2
[error]   Not found
[error]   Not found
[error]   not found: C:....ivy2localorg.apache.sparkspark-sql_2.123.0.0-preview2ivysivy.xml
[error]   checksum format error: C:....AppDataLocalCoursierCachev1httpsrepo1.maven.orgmaven2orgapachesparkspark-sql_2.123.0.0-preview2.spark-sql_2.12-3.0.0-preview2.pom__sha1
[error] Total time: 1 s, completed 17 Sep 2022, 15:13:49
[info] shutting down sbt server

The build.sbt is:

/*ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / scalaVersion := "2.13.8"
lazy val root = (project in file("."))
  .settings(
    name := "spark-learning"
  )*/
// Name of the package
name := "spark-learning"
// Version of our package
version := "1.0"
// Version of Scala
scalaVersion := "2.12.14"
// Spark library dependencies
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "3.0.0-preview2",
"org.apache.spark" %% "spark-sql" % "3.0.0-preview2"
)

What could suddenly be causing these problems, and how do I get rid of them?

One odd thing in the error messages is the path where it looks for the dependencies: ...ivy2localorg.apache.spark...

There should be a \ after local, shouldn't there?

Could your sbt repository configuration be messed up? I'm not sure where it lives on Windows; on Linux it is usually /etc/sbt/repositories, but IntelliJ may well have its own setting.
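If that is the cause, it may help to check (or create) a user-level repositories file. The sketch below assumes the standard sbt launcher format and the default user-level location (~/.sbt/repositories, i.e. C:\Users\<you>\.sbt\repositories on Windows); for this build it only needs to list the local Ivy repository and Maven Central:

[repositories]
  local
  maven-central

If IntelliJ's bundled sbt is picking up a broken repositories file from somewhere else, you can also try pointing it at this one explicitly with the launcher options -Dsbt.repository.config=<path-to-file> and -Dsbt.override.build.repos=true; in IntelliJ those can usually be added to the sbt VM parameters.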
