Spark and Cassandra Java application: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/Dataset



I have a very simple Java application that I copied almost entirely from this example: http://markmail.org/download.xqy?id=zua6upabiylzeetp&number=2

All I am trying to do is read the table data and display it in the Eclipse console.

My pom.xml:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>chat_connaction_test</groupId>
  <artifactId>ChatSparkConnectionTest</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <dependencies>
    <dependency>
      <groupId>com.datastax.cassandra</groupId>
      <artifactId>cassandra-driver-core</artifactId>
      <version>3.1.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>2.0.0</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector_2.10 -->
    <dependency>
      <groupId>com.datastax.spark</groupId>
      <artifactId>spark-cassandra-connector_2.10</artifactId>
      <version>2.0.0-M3</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming_2.10 -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming_2.10</artifactId>
      <version>2.0.0</version>
    </dependency>
    <!--
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-hive_2.10</artifactId>
      <version>1.5.2</version>
    </dependency>
    -->
  </dependencies>
</project>

And my Java code:

package com.chatSparkConnactionTest;

import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;

import java.io.Serializable;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;

import com.datastax.spark.connector.japi.CassandraRow;

public class JavaDemo implements Serializable {
    private static final long serialVersionUID = 1L;

    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
            .setAppName("chat")
            .setMaster("local")
            .set("spark.executor.memory", "1g")
            .set("spark.cassandra.connection.host", "127.0.0.1");
        JavaSparkContext sc = new JavaSparkContext(conf);
        JavaRDD<String> cassandraRowsRDD = javaFunctions(sc)
            .cassandraTable("chat", "dictionary")
            .map(new Function<CassandraRow, String>() {
                @Override
                public String call(CassandraRow cassandraRow) throws Exception {
                    String tempResult = cassandraRow.toString();
                    System.out.println(tempResult);
                    return tempResult;
                }
            });
        System.out.println("Data as CassandraRows: \n" +
            cassandraRowsRDD.collect().size()); // THIS IS THE LINE WITH THE ERROR
    }
}

And here is the error I get:

16/10/05 20:49:18 INFO CassandraConnector: Connected to Cassandra cluster: Test Cluster
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/Dataset
    at java.lang.Class.getDeclaredMethods0(Native Method)
    at java.lang.Class.privateGetDeclaredMethods(Unknown Source)
    at java.lang.Class.getDeclaredMethod(Unknown Source)
    at java.io.ObjectStreamClass.getPrivateMethod(Unknown Source)
    at java.io.ObjectStreamClass.access$1700(Unknown Source)
    at java.io.ObjectStreamClass$2.run(Unknown Source)
    at java.io.ObjectStreamClass$2.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.io.ObjectStreamClass.<init>(Unknown Source)
    at java.io.ObjectStreamClass.lookup(Unknown Source)
    at java.io.ObjectOutputStream.writeObject0(Unknown Source)
    at java.io.ObjectOutputStream.defaultWriteFields(Unknown Source)
    at java.io.ObjectOutputStream.writeSerialData(Unknown Source)
    at java.io.ObjectOutputStream.writeOrdinaryObject(Unknown Source)
    at java.io.ObjectOutputStream.writeObject0(Unknown Source)
    at java.io.ObjectOutputStream.defaultWriteFields(Unknown Source)
    at java.io.ObjectOutputStream.writeSerialData(Unknown Source)
    at java.io.ObjectOutputStream.writeOrdinaryObject(Unknown Source)
    at java.io.ObjectOutputStream.writeObject0(Unknown Source)
    at java.io.ObjectOutputStream.writeObject(Unknown Source)
    at scala.collection.immutable.$colon$colon.writeObject(List.scala:379)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at java.io.ObjectStreamClass.invokeWriteObject(Unknown Source)
    at java.io.ObjectOutputStream.writeSerialData(Unknown Source)
    at java.io.ObjectOutputStream.writeOrdinaryObject(Unknown Source)
    at java.io.ObjectOutputStream.writeObject0(Unknown Source)
    at java.io.ObjectOutputStream.defaultWriteFields(Unknown Source)
    at java.io.ObjectOutputStream.writeSerialData(Unknown Source)
    at java.io.ObjectOutputStream.writeOrdinaryObject(Unknown Source)
    at java.io.ObjectOutputStream.writeObject0(Unknown Source)
    at java.io.ObjectOutputStream.defaultWriteFields(Unknown Source)
    at java.io.ObjectOutputStream.writeSerialData(Unknown Source)
    at java.io.ObjectOutputStream.writeOrdinaryObject(Unknown Source)
    at java.io.ObjectOutputStream.writeObject0(Unknown Source)
    at java.io.ObjectOutputStream.defaultWriteFields(Unknown Source)
    at java.io.ObjectOutputStream.writeSerialData(Unknown Source)
    at java.io.ObjectOutputStream.writeOrdinaryObject(Unknown Source)
    at java.io.ObjectOutputStream.writeObject0(Unknown Source)
    at java.io.ObjectOutputStream.writeObject(Unknown Source)
    at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:43)
    at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:100)
    at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:295)
    at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:288)
    at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:108)
    at org.apache.spark.SparkContext.clean(SparkContext.scala:2037)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1896)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1911)
    at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:893)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:358)
    at org.apache.spark.rdd.RDD.collect(RDD.scala:892)
    at org.apache.spark.api.java.JavaRDDLike$class.collect(JavaRDDLike.scala:360)
    at org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45)
    at com.chatSparkConnactionTest.JavaDemo.main(JavaDemo.java:37)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.Dataset
    at java.net.URLClassLoader.findClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    ... 58 more

I have updated my pom.xml, but that did not resolve the error. Can anyone help me figure out the problem?

Thank you!

UPDATE 1: Here is a screenshot of my build path: link to my screenshot

You are getting the "java.lang.NoClassDefFoundError: org/apache/spark/sql/Dataset" error because the "spark-sql" dependency is missing from your pom.xml file.

If you want to read a Cassandra table with Spark 2.0.0, then you need at least the following minimal dependencies.

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.0.0</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.11</artifactId>
  <version>2.0.0</version>
</dependency>
<dependency>
  <groupId>com.datastax.spark</groupId>
  <artifactId>spark-cassandra-connector_2.11</artifactId>
  <version>2.0.0-M3</version>
</dependency>

Spark 2.0.0 provides the SparkSession and Dataset API. Below is a sample program that reads a Cassandra table and prints the records.

import java.util.HashMap;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class SparkCassandraDatasetApplication {
    public static void main(String[] args) {
        SparkSession spark = SparkSession
            .builder()
            .appName("SparkCassandraDatasetApplication")
            .config("spark.sql.warehouse.dir", "/file:C:/temp")
            .config("spark.cassandra.connection.host", "127.0.0.1")
            .config("spark.cassandra.connection.port", "9042")
            .master("local[2]")
            .getOrCreate();

        // Read data through the connector's DataSource
        Dataset<Row> dataset = spark.read().format("org.apache.spark.sql.cassandra")
            .options(new HashMap<String, String>() {
                {
                    put("keyspace", "mykeyspace");
                    put("table", "mytable");
                }
            }).load();

        // Print data
        dataset.show();
        spark.stop();
    }
}
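Since the connector exposes the table as an ordinary Dataset&lt;Row&gt;, you can also register it as a temporary view and query it with Spark SQL. This is a minimal sketch, not from the original answer, assuming the dataset variable from the program above and that mytable has the same id and username columns used by the UserData bean further below:

// Minimal Spark SQL sketch; assumes "dataset" from the program above
// and that mytable has "id" and "username" columns (illustrative names).
dataset.createOrReplaceTempView("mytable");
Dataset<Row> filtered = spark.sql(
    "SELECT id, username FROM mytable WHERE id IS NOT NULL");
filtered.show();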

If you still want to use the RDD API, then use the sample program below.

import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

import com.datastax.spark.connector.japi.CassandraJavaUtil;

public class SparkCassandraRDDApplication {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
            .setAppName("SparkCassandraRDDApplication")
            .setMaster("local[2]")
            .set("spark.cassandra.connection.host", "127.0.0.1")
            .set("spark.cassandra.connection.port", "9042");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Read rows into instances of the UserData bean
        JavaRDD<UserData> resultsRDD = javaFunctions(sc)
            .cassandraTable("mykeyspace", "mytable", CassandraJavaUtil.mapRowTo(UserData.class));

        // Print
        resultsRDD.foreach(data -> {
            System.out.println(data.id);
            System.out.println(data.username);
        });
        sc.stop();
    }
}

The JavaBean (UserData) used in the above program is as follows:

import java.io.Serializable;

public class UserData implements Serializable {
    String id;
    String username;

    public String getId() {
        return id;
    }
    public void setId(String id) {
        this.id = id;
    }
    public String getUsername() {
        return username;
    }
    public void setUsername(String username) {
        this.username = username;
    }
}
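The connector can also write such beans back to Cassandra. This is a minimal sketch, not from the original answer, assuming the resultsRDD from the RDD program above and the same mykeyspace.mytable table; mapToRow is the write-side counterpart of mapRowTo:

// Write-back sketch (assumes resultsRDD from SparkCassandraRDDApplication).
// Needs one more static import next to javaFunctions:
//   import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapToRow;
javaFunctions(resultsRDD)
    .writerBuilder("mykeyspace", "mytable", mapToRow(UserData.class))
    .saveToCassandra();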

I think you need to make sure the following resources are present on your classpath (a quick way to verify this is sketched after the list):

cassandra-driver-core-2.1.0.jar
metrics-core-3.0.2.jar
slf4j-api-1.7.5.jar
netty-3.9.0-Final.jar
guava-16.0.1.jar
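To check what actually ends up on the runtime classpath, a simple sketch is to print the java.class.path system property (the class name here is illustrative):

// Prints every entry on the runtime classpath, one per line.
public class ClasspathCheck {
    public static void main(String[] args) {
        String sep = java.io.File.pathSeparator;
        for (String entry : System.getProperty("java.class.path").split(sep)) {
            System.out.println(entry);
        }
    }
}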

Hope this helps.

Remove

<!-- https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector-java_2.10 -->
<dependency>
  <groupId>com.datastax.spark</groupId>
  <artifactId>spark-cassandra-connector-java_2.10</artifactId>
  <version>1.6.0-M1</version>
</dependency>

You are mixing different versions on the classpath. The java module was merged into the core module in Spark Cassandra Connector 2.0.0, so spark-cassandra-connector-java should only be referenced for Spark 1.6; with the 2.0.0-M3 connector dependency already shown above, it is not needed.
