Missing or invalid dependency detected while loading class file 'SQLTestUtilsBase.class'



I am trying to write unit tests for Spark Scala code, and I found this post: How to write unit tests in Spark 2.0+? However, when I add the dependencies it suggests, I get the following error at compile time:

Error:scalac: missing or invalid dependency detected while loading class file 'SQLTestUtilsBase.class'.
Could not access type PlanTestBase in package org.apache.spark.sql.catalyst.plans,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'SQLTestUtilsBase.class' was compiled against an incompatible version of org.apache.spark.sql.catalyst.plans.

I tried re-running with -Ylog-classpath, but that did not help. Here are what I believe to be the relevant pom entries for the Maven build:

<!-- #### SPARK #### -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.11.8</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.11.8</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.11</artifactId>
    <version>2.11.8</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>2.11.8</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
    <version>2.11.8</version>
</dependency>
<!-- #### SPARK #### -->
<!-- #### Testing #### -->
<dependency>
    <groupId>org.scalatest</groupId>
    <artifactId>scalatest_2.11</artifactId>
    <version>3.0.8</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.scalamock</groupId>
    <artifactId>scalamock_2.11</artifactId>
    <version>4.1.0</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.scalactic</groupId>
    <artifactId>scalactic_2.11</artifactId>
    <version>3.0.8</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.11.8</version>
    <type>test-jar</type>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.11.8</version>
    <type>test-jar</type>
    <scope>test</scope>
</dependency>
<!-- #### Testing #### -->

What is causing this conflict?

Solved by adding the spark-catalyst dependency. The SQLTestUtilsBase trait extends PlanTestBase, which is defined in the spark-catalyst artifact.

Also, I think you should change the Spark version 2.11.8 to a version that actually exists in the Maven repository (2.11.8 is a Scala version, not a Spark release), for example:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-catalyst_2.11</artifactId>
    <version>2.4.0</version>
    <type>test-jar</type>
    <scope>test</scope>
</dependency>

See:

  • https://mvnrepository.com/artifact/org.apache.spark/spark-core
  • org/apache/spark/sql/test/SQLTestUtils.scala#L222-L226
  • org/apache/spark/sql/catalyst/plans/PlanTest.scala
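
Since the original goal was simply to unit test Spark code, note that the Spark test-jars are optional: a plain ScalaTest suite that manages its own local SparkSession only needs the regular spark-sql and scalatest dependencies. A minimal sketch (class and test names are illustrative):

import org.apache.spark.sql.SparkSession
import org.scalatest.{BeforeAndAfterAll, FunSuite}

// Hypothetical suite that avoids Spark's internal test traits entirely by
// creating and stopping its own local SparkSession.
class LocalSessionSuite extends FunSuite with BeforeAndAfterAll {
  private lazy val spark: SparkSession =
    SparkSession.builder().master("local[2]").appName("unit-test").getOrCreate()

  import spark.implicits._

  override def afterAll(): Unit = {
    spark.stop()
  }

  test("upper-cases a column") {
    val result = Seq("a", "b").toDF("letter")
      .selectExpr("upper(letter) AS letter")
      .as[String]
      .collect()
      .sorted
    assert(result.sameElements(Array("A", "B")))
  }
}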
