k8s Spark Job JAR params



I'm using the manifest below, and when I apply it I get the error shown further down. Is this the correct way to pass the JAR arguments?

apiVersion: batch/v1
kind: Job
metadata:
  name: spark-on-eks
spec:
  template:
    spec:
      containers:
        - name: spark
          image: repo:buildversion
          command: [
            "/bin/sh",
            "-c",
            "/opt/spark/bin/spark-submit
            --master k8s://EKSEndpoint
            --deploy-mode cluster
            --name spark-luluapp
            --class com.ll.jsonclass
            --conf spark.jars.ivy=/tmp/.ivy
            --conf spark.kubernetes.container.image=repo:buildversion
            --conf spark.kubernetes.namespace=spark-pi
            --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark-sa
            --conf spark.hadoop.fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem
            --conf spark.kubernetes.authenticate.executor.serviceAccountName=spark-sa
            --conf spark.kubernetes.driver.pod.name=spark-job-driver
            --conf spark.executor.instances=4
            local:///opt/spark/examples/App-buildversion-SNAPSHOT.jar
            [mks,env,reg,\"dd.mm.yyyy\",\"true\",\"off\",\"db-comp-results\",\"true\",\"XX\",\"XXX\",\"XXXXX\",\"XXX\",$$,###] "
          ]
      serviceAccountName: spark-pi
      restartPolicy: Never
  backoffLimit: 4

The resulting error:

error converting YAML to JSON: yaml: line 33: did not find expected ',' or ']'

Your YAML is malformed. The whole spark-submit command is a double-quoted scalar, so the first embedded double quote in the argument list (the one before dd.mm.yyyy) terminates that scalar early; the flow-style command: [ ... ] array then runs into stray tokens where it expects ',' or ']', which is exactly the error you see. Use a block-style list for command and wrap the long spark-submit string in single quotes, which allow embedded double quotes:

apiVersion: batch/v1
kind: Job
metadata:
  name: spark-on-eks
spec:
  template:
    spec:
      containers:
        - name: spark
          image: repo:buildversion
          command:
            - "/bin/sh"
            - "-c"
            - '/opt/spark/bin/spark-submit
              --master k8s://EKSEndpoint
              --deploy-mode cluster
              --name spark-luluapp
              --class com.ll.jsonclass
              --conf spark.jars.ivy=/tmp/.ivy
              --conf spark.kubernetes.container.image=repo:buildversion
              --conf spark.kubernetes.namespace=spark-pi
              --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark-sa
              --conf spark.hadoop.fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem
              --conf spark.kubernetes.authenticate.executor.serviceAccountName=spark-sa
              --conf spark.kubernetes.driver.pod.name=spark-job-driver
              --conf spark.executor.instances=4
              local:///opt/spark/examples/App-buildversion-SNAPSHOT.jar
              [mks,env,reg,"dd.mm.yyyy","true","off","db-comp-results","true","XX","XXX","XXXXX","XXX",$$,###] '
      serviceAccountName: spark-pi
      restartPolicy: Never
  backoffLimit: 4
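
As for whether this is the right way to pass the JAR arguments: spark-submit forwards everything after the application JAR path to your main class, so you can avoid shell and YAML quoting entirely by invoking spark-submit directly and giving each flag and each application argument its own list item. The snippet below is a minimal sketch of that approach, assuming the image's entrypoint can be bypassed and that com.ll.jsonclass expects these values as separate positional arguments rather than one bracketed string:

      containers:
        - name: spark
          image: repo:buildversion
          # Invoke spark-submit directly; each list item becomes one argv entry,
          # so no /bin/sh -c wrapper and no quoting tricks are needed.
          command: ["/opt/spark/bin/spark-submit"]
          args:
            - --master
            - k8s://EKSEndpoint
            - --deploy-mode
            - cluster
            - --name
            - spark-luluapp
            - --class
            - com.ll.jsonclass
            - --conf
            - spark.kubernetes.container.image=repo:buildversion
            # ...remaining --conf pairs exactly as in the manifest above...
            - local:///opt/spark/examples/App-buildversion-SNAPSHOT.jar
            # Everything after the JAR path is handed to the main class
            # as application arguments, one per list item.
            - mks
            - env
            - reg
            - dd.mm.yyyy
            # ...remaining application arguments...

If your application really does expect the single bracketed string [mks,env,reg,...] as its first argument, keep the shell form above and just fix the quoting as shown.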
