Spark jobserver fails to build with Spark 2.0



I am trying to run spark-jobserver with Spark 2.0. I cloned the spark-2.0-preview branch from the GitHub repository and followed the deployment guide, but when I try to deploy the server with bin/server_deploy.sh, I get compilation errors:

 Error:
[error] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JHiveTestLoaderJob.java:4: cannot find symbol
[error] symbol: class DataFrame
[error] location: package org.apache.spark.sql
[error] import org.apache.spark.sql.DataFrame;
[error] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JHiveTestJob.java:13: java.lang.Object cannot be converted to org.apache.spark.sql.Row[]
[error] return sc.sql(data.getString("sql")).collect();
[error] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JHiveTestLoaderJob.java:25: cannot find symbol
[error] symbol: class DataFrame
[error] location: class spark.jobserver.JHiveTestLoaderJob
[error] final DataFrame addrRdd = sc.sql("SELECT * FROM default.test_addresses");
[error] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JSqlTestJob.java:13: array required, but java.lang.Object found
[error] Row row = sc.sql("select 1+1").take(1)[0];
[info] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JHiveTestJob.java: Some input files use or override a deprecated API.
[info] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JHiveTestJob.java: Recompile with -Xlint:deprecation for details.
[error] (job-server-extras/compile:compileIncremental) javac returned nonzero exit code

Did I forget to add some dependency?

I had a similar problem. I found that it fails because the Spark API changed between 1.x and 2.x. You can find the open issue on GitHub: https://github.com/spark-jobserver/spark-jobserver/issues/760

I made some quick fixes that solved the problem for me and let me deploy the jobserver. I have submitted a pull request: https://github.com/spark-jobserver/spark-jobserver/pull/762
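For context, the compile errors come from two Spark 2.x API changes: the Java-visible `DataFrame` class was removed (SQL queries now return `Dataset<Row>`), and `collect()`/`take()` on a `Dataset` appear as `Object` from Java because of Scala ClassTags, so the `List`-returning variants should be used instead. The sketch below illustrates the kind of change involved; it is a hypothetical minimal example, not the actual patch in the pull request:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class ApiMigrationSketch {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("api-migration-sketch")
                .master("local[*]")
                .getOrCreate();

        // Spark 1.x: DataFrame df = sqlContext.sql(...);
        // Spark 2.x: DataFrame no longer exists in the Java API; use Dataset<Row>.
        Dataset<Row> df = spark.sql("SELECT 1 + 1 AS sum");

        // Spark 1.x: Row row = df.take(1)[0];  or  Row[] rows = df.collect();
        // Spark 2.x (Java): take()/collect() compile as returning Object,
        // so use takeAsList()/collectAsList() instead.
        Row first = df.takeAsList(1).get(0);
        System.out.println(first.getInt(0));

        spark.stop();
    }
}
```

Applying the same two substitutions in JHiveTestLoaderJob.java, JHiveTestJob.java, and JSqlTestJob.java resolves all four `[error]` entries in the log above.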
