Error: object xml is not a member of package com.databricks.spark



I am trying to read an XML file with Spark, but I am running into a problem when I compile the project with SBT.

build.sbt

name:= "First Spark"
version:= "1.0"
organization := "in.goai"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.0.0"
libraryDependencies += "com.databricks" % "spark-avro_2.10" % "2.0.1"
libraryDependencies += "org.scala-lang.modules" %% "scala-xml" % "1.0.2"
resolvers += Resolver.mavenLocal

.scala file

package in.goai.spark
import scala.xml._
import com.databricks.spark.xml
import org.apache.spark.sql.SQLContext
import org.apache.spark.{SparkContext, SparkConf}
object SparkMeApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("First Spark")
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)
    val fileName = args(0)
    val df = sqlContext.read.format("com.databricks.spark.xml").option("rowTag", "book").load("fileName")
    val selectedData = df.select("title", "price")
    val d = selectedData.show
    println(s"$d")
  }
}

When I compile it by running "sbt package", it shows the error below:

[error] /home/hadoop/dev/first/src/main/scala/SparkMeApp.scala:4: object xml is not a member of package com.databricks.spark
[error] import com.databricks.spark.xml
[error]        ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 9 s, completed Sep 22, 2017 4:11:19 PM

Do I need to add any other JAR files related to XML? Please advise, and please share any link with information about the JAR files needed for different file formats.

Since you are using Scala 2.11 and Spark 2.0, change your dependencies in build.sbt to the following:

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.0.0"
libraryDependencies += "com.databricks" %% "spark-avro" % "3.2.0"
libraryDependencies += "com.databricks" %% "spark-xml" % "0.4.1"
libraryDependencies += "org.scala-lang.modules" %% "scala-xml" % "1.0.6"
  1. spark-avro version changed to 3.2.0: https://github.com/databricks/spark-avro#requirements
  2. Added "com.databricks" %% "spark-xml" % "0.4.1": https://github.com/databricks/spark-xml#scala-211
  3. scala-xml version changed to 1.0.6, the current version for Scala 2.11: http://mvnrepository.com/artifact/org.scala-lang.modules/scala-xml_2.11
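
For reference, a complete build.sbt with these changes might look like the following sketch (the project name, version, organization, and Scala version are simply carried over from the question):

name := "First Spark"
version := "1.0"
organization := "in.goai"
scalaVersion := "2.11.8"

// Spark 2.0, built for Scala 2.11
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.0.0"

// Databricks data sources; %% appends the Scala binary version (_2.11) automatically
libraryDependencies += "com.databricks" %% "spark-avro" % "3.2.0"
libraryDependencies += "com.databricks" %% "spark-xml" % "0.4.1"

// scala-xml for Scala 2.11
libraryDependencies += "org.scala-lang.modules" %% "scala-xml" % "1.0.6"

resolvers += Resolver.mavenLocal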

In your code, remove the following import statement:

import com.databricks.spark.xml

Note that your code does not actually use the spark-avro or scala-xml libraries. If you are not using them, remove those dependencies from your build.sbt (and the import scala.xml._ statement from your code).
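
Putting it all together, a minimal sketch of the corrected SparkMeApp.scala could look like the code below. Note that it passes the fileName variable to load; the original code passes the literal string "fileName", which looks like a separate bug. The "book" row tag and the "title" and "price" column names are taken from the question.

package in.goai.spark

import org.apache.spark.sql.SQLContext
import org.apache.spark.{SparkContext, SparkConf}

object SparkMeApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("First Spark")
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)

    // Path to the XML file, passed as the first program argument
    val fileName = args(0)

    // The spark-xml data source is referenced by its format name;
    // no import of com.databricks.spark.xml is needed
    val df = sqlContext.read
      .format("com.databricks.spark.xml")
      .option("rowTag", "book")
      .load(fileName)

    val selectedData = df.select("title", "price")
    selectedData.show()
  }
}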
