SQLContext version should match spark-core



I am setting up my workspace in Scala-IDE for spark-core and DataFrames. These are the dependencies declared in my pom.xml:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.11 -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>1.6.1</version>
</dependency>

These versions look incompatible. Which version of spark-sql (SQLContext) goes with spark-core 1.6.2?

I would write it like this:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>1.6.2</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>1.6.2</version>
</dependency>

I would use the same Scala version in both dependencies! You cannot mix Scala 2.10 for spark-core with Scala 2.11 for spark-sql. At the moment Spark Core 1.6.3 is available but Spark SQL 1.6.3 is not; once it is released you should upgrade to it, since it ships many bug fixes.

<properties>
    <scala.version>2.11.7</scala.version>
    <scala.compat.version>2.11</scala.compat.version>
    <spark.sql.version>1.6.0</spark.sql.version>
    <spark.core.version>1.6.0</spark.core.version>
</properties>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_${scala.compat.version}</artifactId>
    <version>${spark.sql.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_${scala.compat.version}</artifactId>
    <version>${spark.core.version}</version>
</dependency>
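With spark-core and spark-sql aligned on the same version and Scala binary version, the SQLContext itself is just created from the SparkContext. A minimal sketch against the Spark 1.6.x API (the app name and sample data are made up for illustration):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object SqlContextExample {
  def main(args: Array[String]): Unit = {
    // Local mode is fine for checking that the dependencies resolve
    val conf = new SparkConf().setAppName("sqlcontext-demo").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // In Spark 1.6.x, SQLContext wraps an existing SparkContext
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._

    // Build a small DataFrame to confirm spark-sql works with spark-core
    val df = sc.parallelize(Seq((1, "a"), (2, "b"))).toDF("id", "value")
    df.show()

    sc.stop()
  }
}
```

If the two artifacts disagree on Scala binary version (2.10 vs 2.11), this code fails at runtime with `NoSuchMethodError`/`ClassNotFoundException`-style errors rather than at compile time, which is why keeping the `_2.1x` suffixes consistent matters.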
