I am trying to build a very basic Scala script with a Spark dependency, but I am not able to build a jar out of it.
The error generated:
sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.12;1.6.0-SNAPSHOT: not found
My build.sbt:
import Dependencies._

lazy val root = (project in file(".")).
  settings(
    inThisBuild(List(
      organization := "com.example",
      scalaVersion := "2.12.1",
      version := "0.1.0-SNAPSHOT"
    )),
    name := "Hello",
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0-SNAPSHOT",
    resolvers += Resolver.mavenLocal
  )
package example

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object Hello {
  def main(args: Array[String]) {
    val logFile = "/Users/dhruvsha/Applications/spark/README.md"
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println(s"Lines with a: $numAs, Lines with b: $numBs")
    sc.stop()
  }
}
My Scala source is at:
/exampleapp/main/scala/example/hello.scala
The project name is exampleapp.
Scala version: 2.12.2
Spark version: 1.6.0
sbt version: 0.13.13
Any kind of help would be appreciated, and it would be great if you could also point me to resources for learning about sbt and Spark dependencies. Please note that I am new to Scala, Spark and sbt.
The libraryDependencies line in your build.sbt seems to be wrong. It should be:
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.0"
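For context: %% tells sbt to append the project's Scala binary version to the artifact name, which is why, with scalaVersion set to 2.12.1, it went looking for spark-core_2.12;1.6.0-SNAPSHOT. That artifact was never published: Spark 1.6.0 exists on Maven Central only for Scala 2.10 and 2.11, and only as a release, not a snapshot. Also note that spark-core_2.10 contains Scala 2.10 binaries, so the project's scalaVersion has to match it. Here is a minimal sketch of a build.sbt that keeps %% and instead pins the Scala version to one Spark 1.6.0 supports (2.10.6 is just one 2.10.x release I picked for illustration; a 2.11.x release with spark-core_2.11 would work the same way):

import Dependencies._

lazy val root = (project in file(".")).
  settings(
    inThisBuild(List(
      organization := "com.example",
      // Spark 1.6.0 was published only for Scala 2.10 and 2.11,
      // so the project's Scala version must be one of those.
      scalaVersion := "2.10.6",
      version := "0.1.0-SNAPSHOT"
    )),
    name := "Hello",
    // %% appends the Scala binary version, so this resolves to
    // spark-core_2.10;1.6.0, which does exist on Maven Central.
    // Resolver.mavenLocal is no longer needed for a released version.
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0"
  )

With that in place, sbt package should produce a jar under target/scala-2.10/ (the exact file name depends on the name and version settings), which you can then run with spark-submit --class example.Hello <path-to-jar>.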