I have an SBT multi-project setup outlined at https://github.com/geoheil/sf-sbt-sbt-multiproject-depperency-problemboy and want to be able to execute `sbt console` in the root project.
When executing:
```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").enableHiveSupport().getOrCreate()
spark.sql("CREATE database foo")
```
the error in the root console is:
```
java.lang.NoClassDefFoundError: Could not initialize class org.apache.derby.jdbc.EmbeddedDriver
```
Strangely, it works fine in the sub-project:
```
sbt
project common
console
```

and pasting the same code there.
Questions
- How can I fix the SBT console so that it directly loads the correct dependencies?
- How can I load the console directly for a sub-project? `sbt common/console` does not seem to do the trick.
Details

The most important settings are:
```scala
lazy val global = project
  .in(file("."))
  .settings(
    settings,
    libraryDependencies ++= commonDependencies
  )
  .aggregate(common)
  .dependsOn(common)

lazy val common = project
  .settings(
    name := "common",
    settings,
    libraryDependencies ++= commonDependencies
  )

lazy val dependencies =
  new {
    val sparkV = "2.3.0"

    val sparkBase = "org.apache.spark" %% "spark-core" % sparkV % "provided"
    val sparkSql  = "org.apache.spark" %% "spark-sql"  % sparkV % "provided"
    val sparkHive = "org.apache.spark" %% "spark-hive" % sparkV % "provided"
  }

lazy val commonDependencies = Seq(
  dependencies.sparkBase,
  dependencies.sparkHive,
  dependencies.sparkSql
)

lazy val settings = commonSettings

lazy val commonSettings = Seq(
  fork := true,
  run in Compile := Defaults
    .runTask(fullClasspath in Compile, mainClass.in(Compile, run), runner.in(Compile, run))
    .evaluated
)
```
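One possible workaround, sketched below as an untested assumption (it is based on the `AccessControlException` mentioned in the edit further down, i.e. SPARK-22918, and has not been verified against this repository): clear the SecurityManager before the REPL starts via sbt's `initialCommands` key, so that Derby's embedded driver can acquire its internal `SystemPermission`.

```scala
// Sketch only, for development use: drop the SecurityManager before the
// console REPL starts, so org.apache.derby.jdbc.EmbeddedDriver can request
// SystemPermission("engine", "usederbyinternals") without being denied.
lazy val global = project
  .in(file("."))
  .settings(
    settings,
    libraryDependencies ++= commonDependencies,
    initialCommands in console := "System.setSecurityManager(null)"
  )
  .aggregate(common)
  .dependsOn(common)
```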
Related questions
- Transitive dependency errors in an SBT multi-project
- SBT test does not work for spark test
Edit
Strangely: this setup works fine with Spark 2.2.0. Only 2.2.1/2.3.0 cause these problems, yet everything works in a single-project setup, or when the console is started in the correct sub-project.
Also,

```
java.security.AccessControlException: access denied org.apache.derby.security.SystemPermission( "engine", "usederbyinternals" )
```

is mentioned in the stack trace.
Indeed, the code from "SBT test does not work for spark test":

```scala
if (appName == "dev") {
  System.setSecurityManager(null)
}
```

fixes it for development.
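Combined with the original snippet, a dev-only session in the root `sbt console` might then look like the sketch below (an assumption, not a verified fix; disabling the SecurityManager is only acceptable in development):

```scala
// Dev-only workaround (sketch): drop the SecurityManager first so Derby's
// EmbeddedDriver can initialize, then create the Hive-enabled session.
System.setSecurityManager(null)

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").enableHiveSupport().getOrCreate()
spark.sql("CREATE database foo")
```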
- https://github.com/holdenk/spark-testing-base/issues/148
- https://issues.apache.org/jira/browse/spark-22918