I have written a Java program in Eclipse to join two tables in Spark, but I am getting an error near the package declaration. The error is:

The type scala.reflect.api.TypeTags$TypeTag cannot be resolved. It is indirectly referenced from required .class files

This is the program I wrote:
package joins;

import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.hive.HiveContext;

public class Spark {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf();
        SparkContext sc = new SparkContext(conf);
        HiveContext sqlContext = new HiveContext(sc);

        // Switch to the Hive database that holds the source tables
        sqlContext.sql("use myown");

        // Load and cache the customer table
        DataFrame table_01 = sqlContext.sql("SELECT * FROM customer");
        table_01.saveAsTable("spark_table_01");
        sqlContext.cacheTable("spark_table_01");

        // Load and cache the account table
        DataFrame table_02 = sqlContext.sql("SELECT * FROM account");
        table_02.saveAsTable("spark_table_02");
        sqlContext.cacheTable("spark_table_02");

        // Join the two tables on ssn and append the result to customeraccount
        DataFrame table_join = sqlContext.sql("SELECT a.* FROM customer a JOIN account b ON a.ssn = b.ssn");
        table_join.insertInto("customeraccount");

        sqlContext.uncacheTable("spark_table_01");
        sqlContext.uncacheTable("spark_table_02");
    }
}
It looks like your application is missing the scala-reflect jar. Download scala-reflect.jar, put it on the classpath, and recompile.
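If the project is built with Maven rather than by copying the jar by hand, the same fix can be expressed as a dependency. A minimal sketch, assuming a Maven build and Scala 2.11.8 (both are assumptions; match the version to the Scala version your Spark build was compiled against):

<!-- Hypothetical pom.xml fragment: puts scala-reflect on the compile classpath -->
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-reflect</artifactId>
    <version>2.11.8</version>
</dependency>

The version matters: the scala-reflect jar must match the Scala line your Spark distribution was built against (Spark 2.1.0, for example, ships with Scala 2.11), otherwise the same resolution error can reappear.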
In Eclipse, I solved this by adding scala-reflect-2.11.8.jar as an external jar. The jar file can be found in the "jars" folder of the Spark directory. I am using Spark 2.1.0.