Convert a spark-submit command (dotnet for a Spark app) into a spark-submit command for a Python app

If the following working spark-submit command (dotnet for a Spark application) were executing a Python script instead, would it still use the same --conf settings? And given a Python script named myapp.py that defines no functions (other than main), what would the --class reference be for the Python script?

/opt/spark/bin/spark-submit --class org.apache.spark.deploy.dotnet.DotnetRunner \
--conf "spark.eventLog.enabled=true" \
--conf "spark.eventLog.dir=file:/usr/bin/spark/hadoop/logs" \
--master spark://spark:7077 \
/opt/spark/jars/microsoft-spark-3-1_2.12-2.0.0.jar \
dotnet myapp.dll "somefilename.txt"

Yes, the same --conf settings apply unchanged. For a Python application, you simply pass the .py file itself; there is no need to specify a class name at all.

/opt/spark/bin/spark-submit \
--conf "spark.eventLog.enabled=true" \
--conf "spark.eventLog.dir=file:/usr/bin/spark/hadoop/logs" \
--master spark://spark:7077 \
/your python file path/myapp.py
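For reference, a minimal myapp.py might look like the sketch below. It shows why no --class is needed: spark-submit simply runs the file's top-level code. The app name and the way the file argument is read are illustrative assumptions, not part of the original question.

# myapp.py -- a minimal PySpark script; spark-submit executes this file
# directly, so no --class argument is required.
import sys

from pyspark.sql import SparkSession


def main(filename):
    # Build (or reuse) the SparkSession. Any --conf settings passed on
    # the spark-submit command line (e.g. spark.eventLog.enabled)
    # apply to this session automatically.
    # (The app name "myapp" is an illustrative assumption.)
    spark = SparkSession.builder.appName("myapp").getOrCreate()

    # Read the text file passed as the first command-line argument,
    # mirroring the "somefilename.txt" argument in the dotnet example.
    df = spark.read.text(filename)
    print(df.count())

    spark.stop()


if __name__ == "__main__":
    main(sys.argv[1])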

For more information, see https://spark.apache.org/docs/latest/submitting-applications.html
