K8S Spark using Argo Workflow



I am exploring Argo Workflows for my Spark use case. Is there any sample YAML that shows how to run a Spark job on Kubernetes with an Argo workflow?

apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: hello-spark-
spec:
  entrypoint: sparkapp
  templates:
  - name: sparkapp
    container:
      image: sparkimage
      command: [sh]
      args:
      - -c
      - "/opt/spark/bin/spark-submit --class org.apache.spark.examples.SparkPi /opt/spark/examples/jars/spark-examples_2.11-2.4.0.jar"
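A quoting note, not from the original post: packing the whole spark-submit command into a single -c string becomes fragile as soon as nested quotes appear (which is what garbled the snippet as first posted). A YAML folded scalar sidesteps the problem; a minimal sketch of the same args block:

      args:
      - -c
      - >-
        /opt/spark/bin/spark-submit
        --class org.apache.spark.examples.SparkPi
        /opt/spark/examples/jars/spark-examples_2.11-2.4.0.jar

The >- scalar folds the lines into one space-separated string, so no inner quoting is needed. Also note that with no --master flag, spark-submit runs the job in local mode inside the workflow pod itself; the answer below runs it in cluster mode against the Kubernetes API instead.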

Below is a sample that runs Spark's Pi example. Just replace the image, the class, and the Kubernetes API URL with the correct values:
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  name: wf-spark-pi
  namespace: spark
spec:
  entrypoint: sparkapp
  templates:
  - name: sparkapp
    container:
      image: Spark-Image
      imagePullPolicy: Always
      command: [sh]
      args:
      - /opt/spark/bin/spark-submit
      - --master
      - k8s://https://<K8S_API_TCP_ADDR>:<K8S_API_TCP_PORT>
      - --deploy-mode
      - cluster
      - --conf
      - spark.kubernetes.namespace=spark
      - --conf
      - spark.kubernetes.container.image=Spark-Image
      - --conf
      - spark.kubernetes.driver.pod.name=spark
      - --conf
      - spark.executor.instances=2
      - --class
      - org.apache.spark.examples.SparkPi
      - local:///opt/spark/examples/jars/spark-examples_2.11-2.4.5.jar
      resources: {}
  restartPolicy: OnFailure
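One operational detail the answer leaves implicit: in cluster mode, spark-submit talks to the Kubernetes API to create the driver and executor pods, so the identity it runs under needs RBAC permissions in the spark namespace. A minimal sketch, assuming a service account named spark (the name is a placeholder) bound to the built-in edit ClusterRole, as in the Spark-on-Kubernetes quick start:

apiVersion: v1
kind: ServiceAccount
metadata:
  name: spark            # placeholder name
  namespace: spark
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: spark-role
  namespace: spark
subjects:
- kind: ServiceAccount
  name: spark
  namespace: spark
roleRef:
  kind: ClusterRole
  name: edit             # broad; a Role scoped to pods/services also works
  apiGroup: rbac.authorization.k8s.io

You would then add --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark to the spark-submit args and set serviceAccountName: spark in the workflow spec so the submitting pod uses the same account. With that in place, the workflow above can be created with kubectl or submitted with the argo CLI.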
