PySpark cluster mode exception - Java gateway process exited before sending its port number



In Apache Airflow I wrote a PythonOperator that uses PySpark to run a job in YARN cluster mode. I initialize the SparkSession object as follows.

spark = SparkSession \
    .builder \
    .appName("test python operator") \
    .master("yarn") \
    .config("spark.submit.deployMode", "cluster") \
    .getOrCreate()

However, when I run the DAG, I get the following exception.

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/airflow/models/taskinstance.py", line 983, in _run_raw_task
    result = task_copy.execute(context=context)
  File "/usr/local/lib/python3.8/dist-packages/airflow/operators/python_operator.py", line 113, in execute
    return_value = self.execute_callable()
  File "/usr/local/lib/python3.8/dist-packages/airflow/operators/python_operator.py", line 118, in execute_callable
    return self.python_callable(*self.op_args, **self.op_kwargs)
  File "/catfish/dags/dags_dag_test_python_operator.py", line 39, in print_count
    spark = SparkSession
  File "/usr/local/lib/python3.8/dist-packages/pyspark/sql/session.py", line 186, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
  File "/usr/local/lib/python3.8/dist-packages/pyspark/context.py", line 371, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "/usr/local/lib/python3.8/dist-packages/pyspark/context.py", line 128, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
  File "/usr/local/lib/python3.8/dist-packages/pyspark/context.py", line 320, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway(conf)
  File "/usr/local/lib/python3.8/dist-packages/pyspark/java_gateway.py", line 105, in launch_gateway
    raise Exception("Java gateway process exited before sending its port number")
Exception: Java gateway process exited before sending its port number

I also set PYSPARK_SUBMIT_ARGS, but it did not work for me!
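For completeness, this is roughly how I set it; the values below are placeholders rather than my real job settings, and the variable has to be exported before the SparkContext is created:

import os
from pyspark.sql import SparkSession

# Set before any SparkContext exists; the trailing "pyspark-shell" token is
# required so that pyspark's launch_gateway() can start the JVM gateway.
os.environ["PYSPARK_SUBMIT_ARGS"] = "--master yarn --deploy-mode client pyspark-shell"

spark = SparkSession.builder.appName("test python operator").getOrCreate()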

You need to install Spark in your Ubuntu container.

# Install Java plus the tools needed to fetch and unpack Spark
RUN apt-get -y install default-jdk scala git curl wget
# Download the Spark 2.4.6 (Hadoop 2.7) distribution and point SPARK_HOME at it
RUN wget --no-verbose https://downloads.apache.org/spark/spark-2.4.6/spark-2.4.6-bin-hadoop2.7.tgz
RUN tar xvf spark-2.4.6-bin-hadoop2.7.tgz
RUN mv spark-2.4.6-bin-hadoop2.7 /opt/spark
ENV SPARK_HOME=/opt/spark
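As a small extension (my assumption, not something the snippet above strictly requires), you can also put the Spark binaries on the PATH so that spark-submit resolves for the airflow user, and make sure the pip-installed pyspark matches the 2.4.6 distribution under SPARK_HOME:

# Assumption: expose spark-submit and the other Spark binaries on PATH.
ENV PATH=$PATH:$SPARK_HOME/bin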

Unfortunately, you cannot run Spark on YARN in cluster deploy mode from a PythonOperator. I suggest you use the SparkSubmitOperator or the BashOperator instead.
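A minimal sketch of the SparkSubmitOperator route is below; the import path shown is the Airflow 1.10.x one (in Airflow 2.x the operator lives in the apache-spark provider package), the DAG id, application path and connection id are placeholders, and the spark_default connection is assumed to point at yarn with deploy-mode cluster set in its extra:

from datetime import datetime

from airflow import DAG
# Airflow 1.10.x import path; in Airflow 2.x use the apache-spark provider package.
from airflow.contrib.operators.spark_submit_operator import SparkSubmitOperator

with DAG(
    dag_id="spark_submit_example",          # hypothetical DAG id
    start_date=datetime(2020, 1, 1),
    schedule_interval=None,
) as dag:
    submit_job = SparkSubmitOperator(
        task_id="submit_spark_job",
        application="/catfish/jobs/count_job.py",  # placeholder path to the PySpark script
        conn_id="spark_default",                   # connection pointing at yarn, deploy-mode cluster
        name="test python operator",
    )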

Latest update