SparkSession doesn't work if org.apache.hive:hive-service is placed in the dependencies



I am implementing a simple program in Java that uses Spark SQL to read data from a Parquet file and build an ArrayList of FieldSchema objects (from the Hive metastore API), where each object represents one column with its name and data type. However, Spark SQL does not seem to be able to coexist with the FieldSchema class once I import it.
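For context, what I ultimately want to do looks roughly like the sketch below (the Parquet path, class name, and empty column comment are just placeholders, not my actual code):

import org.apache.hadoop.hive.metastore.api.FieldSchema;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.types.StructField;

import java.util.ArrayList;

public class SchemaReader {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("Application Name")
                .config("spark.master", "local")
                .getOrCreate();

        // Read the Parquet file (placeholder path) and inspect its schema
        Dataset<Row> df = spark.read().parquet("/path/to/data.parquet");

        // One FieldSchema per column: column name plus a Hive-compatible type string
        ArrayList<FieldSchema> columns = new ArrayList<>();
        for (StructField field : df.schema().fields()) {
            columns.add(new FieldSchema(field.name(), field.dataType().catalogString(), ""));
        }

        columns.forEach(c -> System.out.println(c.getName() + " : " + c.getType()));
        spark.stop();
    }
}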

For example, take the following trivial program, used unchanged in both cases:

import org.apache.spark.sql.SparkSession;

public class main {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder().appName("Application Name").config("spark.master", "local").getOrCreate();
    }
}

With this dependency configuration in build.gradle (IntelliJ),

dependencies {
testImplementation 'org.junit.jupiter:junit-jupiter-api:5.7.0'
testRuntimeOnly 'org.junit.jupiter:junit-jupiter-engine:5.7.0'
implementation 'org.apache.spark:spark-sql_2.12:3.1.1'
implementation 'org.apache.spark:spark-core_2.12:3.1.1'
}

the program runs successfully.

On the other hand, this configuration (with org.apache.hive:hive-service added so that org.apache.hadoop.hive.metastore.api.FieldSchema can be imported later)

dependencies {
testImplementation 'org.junit.jupiter:junit-jupiter-api:5.7.0'
testRuntimeOnly 'org.junit.jupiter:junit-jupiter-engine:5.7.0'
implementation "org.apache.hive:hive-service:+"
implementation 'org.apache.spark:spark-sql_2.12:3.1.1'
implementation 'org.apache.spark:spark-core_2.12:3.1.1'
}

produces the following error output:

WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/Users/davidtran/.gradle/caches/modules-2/files-2.1/org.apache.spark/spark-unsafe_2.12/3.1.1/1c3b07cb82e71d0519e5222a5ff38758ab499034/spark-unsafe_2.12-3.1.1.jar) to constructor java.nio.DirectByteBuffer(long,int)
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Exception in thread "main" java.lang.NoSuchFieldError: JAVA_9
at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:207)
at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala)
at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:109)
at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:371)
at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:311)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:359)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:189)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:277)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:458)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2678)
at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:942)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:936)
at main.main(main.java:6)

LATEST UPDATE

I fixed the following error by excluding org.apache.commons:commons-lang3 from the dependency list that org.apache.hadoop:hadoop-common declares in its pom.xml.

java.lang.NoSuchFieldError: JAVA_9
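For reference, that exclusion can be expressed in build.gradle roughly as follows (a sketch only; I attach the exclude to hive-service here because that is the declared dependency that pulls hadoop-common in transitively):

dependencies {
    testImplementation 'org.junit.jupiter:junit-jupiter-api:5.7.0'
    testRuntimeOnly 'org.junit.jupiter:junit-jupiter-engine:5.7.0'
    // Sketch: exclude commons-lang3 from hive-service's transitive dependency
    // tree (which includes hadoop-common), per the fix described above.
    implementation('org.apache.hive:hive-service:+') {
        exclude group: 'org.apache.commons', module: 'commons-lang3'
    }
    implementation 'org.apache.spark:spark-sql_2.12:3.1.1'
    implementation 'org.apache.spark:spark-core_2.12:3.1.1'
}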