I cannot run a custom Spark application on Kubernetes.
I followed the setup guide at https://spark.apache.org/docs/latest/running-on-kubernetes.html and the example walkthrough at https://towardsdatascience.com/how-to-build-spark-from-source-and-deploy-it-to-a-kubernetes-cluster-in-60-minutes-225829b744f9, and I can run the spark-pi example successfully.
I even rebuilt the Spark image so that it contains my xxx.jar in both /opt/spark/examples/jars and /opt/spark/jars, but I still get a failure loading the class. Any ideas what I might have missed? It is especially confusing because I verified that my jar is part of the image, sitting right next to the example jars, and those run fine.
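For reference, the rebuild step can be as small as layering the jar onto the image already in use. This is a hypothetical sketch, not the exact Dockerfile used here; the spark2:latest base tag matches the image name in the spark-submit command below, while target/xxx.jar is an assumed build-output path:

```dockerfile
# Hypothetical sketch: add the application jar on top of the existing Spark image.
FROM spark2:latest
# target/xxx.jar is an assumed build-output location; use your actual jar path.
COPY target/xxx.jar /opt/spark/examples/jars/xxx.jar
```

With spark.kubernetes.container.image.pullPolicy=Never, the locally built image must be tagged exactly as referenced in spark.kubernetes.container.image, since the cluster will never pull it from a registry.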
I run spark-submit like this:
bin/spark-submit \
  --master k8s://https://localhost:6443 \
  --deploy-mode cluster \
  --conf spark.executor.instances=3 \
  --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark \
  --conf spark.kubernetes.container.image=spark2:latest \
  --conf spark.kubernetes.container.image.pullPolicy=Never \
  --class com.xxx.Application \
  --name myApp \
  local:///opt/spark/examples/jars/xxx.jar
Update: added the stack trace:
spark.driver.bindAddress=10.1.0.68 --deploy-mode client --properties-file /opt/spark/conf/spark.properties --class com.xxx.Application local:///opt/spark/examples/jars/xxx.jar
20/05/13 11:02:09 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Error: Failed to load class com.xxx.Application.
log4j:WARN No appenders could be found for logger (org.apache.spark.util.ShutdownHookManager).
Thanks!
Using jar xf <jar name>.jar, verify that the jar you are referencing actually contains the com.xxx.Application class as a com/xxx/Application.class file. Then replace

local:///opt/spark/examples/jars/xxx.jar

with

/opt/spark/examples/jars/xxx.jar

in the spark-submit command.