Spark Scala: unable to import sqlContext.implicits._



I tried the following code, but I cannot import sqlContext.implicits._; it throws an error (in Scala IDE) and the code fails to build:

value implicits is not a member of org.apache.spark.sql.SQLContext

Do I need to add any dependencies to pom.xml?

Spark version 1.5.2

package com.Spark.ConnectToHadoop

import org.apache.spark._
import org.apache.spark.SparkConf
import org.apache.spark.sql._
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.rdd.RDD

object CountWords {
  def main(args: Array[String]) {
    val objConf = new SparkConf().setAppName("Spark Connection").setMaster("spark://IP:7077")
    var sc = new SparkContext(objConf)
    val objHiveContext = new HiveContext(sc)
    objHiveContext.sql("USE test")
    var rdd = objHiveContext.sql("select * from Table1")
    val options = Map("path" -> "hdfs://URL/apps/hive/warehouse/test.db/TableName")
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._      // Error
    val dataframe = rdd.toDF()
    dataframe.write.format("orc").options(options).mode(SaveMode.Overwrite).saveAsTable("TableName")
  }
}

My pom.xml file is as follows:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.Sudhir.Maven1</groupId>
  <artifactId>SparkDemo</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <packaging>jar</packaging>
  <name>SparkDemo</name>
  <url>http://maven.apache.org</url>
  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  </properties>
  <dependencies>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.5.2</version>
    </dependency> 
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.10</artifactId>
      <version>1.0.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-mllib_2.10</artifactId>
      <version>1.5.2</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming_2.10</artifactId>
      <version>0.9.1</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-hive_2.10</artifactId>
      <version>1.2.1</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hive</groupId>
      <artifactId>hive-jdbc</artifactId>
      <version>1.2.1</version>
    </dependency>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>3.8.1</version>
      <scope>test</scope>
    </dependency>     
  </dependencies>
</project>

First, create

val sqlContext = new org.apache.spark.sql.SQLContext(sc)

Now we have a sqlContext with respect to sc (this is available automatically when you launch spark-shell). Then:

import sqlContext.implicits._ 
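
For completeness, here is a minimal, self-contained sketch of that Spark 1.x flow; the app name, master URL, and sample data are placeholders, not taken from the original post:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Create the SQLContext first, then bring its implicits into scope so that
// RDD.toDF() and Seq.toDF() become available.
val conf = new SparkConf().setAppName("ImplicitsDemo").setMaster("local[*]") // placeholder settings
val sc = new SparkContext(conf)
val sqlContext = new SQLContext(sc)

import sqlContext.implicits._ // compiles once spark-sql matches spark-core's version

val df = sc.parallelize(Seq(("a", 1), ("b", 2))).toDF("key", "value")
df.show()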

Since the release of Spark 2.0.0 (July 26, 2016), you should now use the following:

import spark.implicits._  // spark = SparkSession.builder().getOrCreate()

https://databricks.com/blog/2016/08/15/how-to-use-sparksession-in-apache-spark-2-0.html
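
A minimal Spark 2.x sketch of the same idea, assuming a local SparkSession (the app name, master, and sample data below are illustrative):

import org.apache.spark.sql.SparkSession

// In Spark 2.x the SparkSession carries the implicits; no separate SQLContext is needed.
val spark = SparkSession.builder()
  .appName("ImplicitsDemo")   // placeholder app name
  .master("local[*]")         // placeholder master
  .getOrCreate()

import spark.implicits._ // enables .toDF() / .toDS() on local Seqs and RDDs

val df = Seq(("a", 1), ("b", 2)).toDF("key", "value")
df.show()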

You are using an old version of Spark SQL. Change it to:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.10</artifactId>
    <version>1.5.2</version>
</dependency>

For those building with sbt, update the library versions to

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.12" % "2.4.6" % "provided",
  "org.apache.spark" % "spark-sql_2.12" % "2.4.6" % "provided"
)

and then import the SQL implicits as shown below.

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("appName")
  .getOrCreate()

import spark.sqlContext.implicits._
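
Once those implicits are in scope, conversions such as toDF/toDS become available; here is a small illustrative usage (the case class and data are made up for this example):

// Hypothetical example data, only to show the implicit conversions in action.
// Note: in a compiled app, define the case class at the top level (not inside a method),
// otherwise the encoder cannot be derived.
case class Person(name: String, age: Int)

val people = Seq(Person("Alice", 30), Person("Bob", 25)).toDS() // Dataset[Person]
val df = people.toDF()                                          // DataFrame view of the same data
df.show()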

You can also use

<properties>
   <spark.version>2.2.0</spark.version>
</properties>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>${spark.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>${spark.version}</version>
</dependency>
