线程"main" java.lang.NoClassDefFoundError中的异常: org/apache/spark/sql/catalyst/analysis/OverrideFunction



I have tried the following code in Spark with Scala; the code and pom.xml are attached.

package com.Spark.ConnectToHadoop

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.sql.hive.HiveContext

object CountWords {
  def main(args: Array[String]) {
    // Connect to the standalone cluster master
    val objConf = new SparkConf().setAppName("Spark Connection").setMaster("spark://IP:7077")
    val sc = new SparkContext(objConf)

    // HiveContext lives in the spark-hive artifact; a missing or mismatched
    // version of that dependency triggers the NoClassDefFoundError above
    val objHiveContext = new HiveContext(sc)
    objHiveContext.sql("USE test")

    // List the tables in the "test" database and print each row
    val test = objHiveContext.sql("show tables")
    for (row <- test.collect()) {
      println(row)
    }
  }
}
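
For reference, a job like this would typically be packaged with Maven and launched through spark-submit. This is only a sketch: the class name and master URL come from the code above, and the jar name is inferred from the pom coordinates shown below, so adjust it to your actual build output.

    # hypothetical build-and-submit sequence; the jar name follows the pom's
    # artifactId and version and may differ in your setup
    mvn clean package
    spark-submit \
      --class com.Spark.ConnectToHadoop.CountWords \
      --master spark://IP:7077 \
      target/SparkDemo-IntervalMeterData1.jar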

I have already added the spark-core_2.10, spark-catalyst_2.10, spark-sql_2.10, and spark-hive_2.10 dependencies. Do I need to add any more?

Edit:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.Sudhir.Maven1</groupId>
    <artifactId>SparkDemo</artifactId>
    <version>IntervalMeterData1</version>
    <packaging>jar</packaging>
    <name>SparkDemo</name>
    <url>http://maven.apache.org</url>
    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <spark.version>1.5.2</spark.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.5.2</version>
        </dependency> 
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>1.5.2</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-catalyst_2.10</artifactId>
            <version>1.5.2</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.10</artifactId>
            <version>1.2.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-jdbc</artifactId>
            <version>1.2.1</version>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>3.8.1</version>
            <scope>test</scope>
        </dependency>     
    </dependencies>
</project>

Looks like you forgot to bump spark-hive:

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.10</artifactId>
        <version>1.5.2</version>
    </dependency>

Consider introducing a Maven property such as spark.version:

    <properties>
        <spark.version>1.5.2</spark.version>
    </properties>

and modify all of the Spark dependencies accordingly:

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>

Bumping Spark's version will then be far less painful.
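
To see which versions actually end up resolved onto the classpath, Maven's standard dependency plugin can print the tree (a general Maven command, not something from the original answer):

    mvn dependency:tree -Dincludes=org.apache.spark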

Just adding the spark.version property under <properties> is not enough; you also have to reference it in each dependency with ${spark.version}.
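
Putting both points together, the Spark section of the pom would look roughly like this (a sketch covering only the artifacts already listed in the question; hive-jdbc and junit are unchanged):

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <spark.version>1.5.2</spark.version>
    </properties>
    <dependencies>
        <!-- all Spark artifacts now share one version -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-catalyst_2.10</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.10</artifactId>
            <version>${spark.version}</version>
        </dependency>
    </dependencies>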
