py4j.protocol.Py4JError: An error occurred while calling None.None. Trace: Authentication error: unexpected command



I recently installed Spark 2.4.3, and when I try to run PySpark I get the following exception and don't know how to fix it:

Traceback (most recent call last):
  File "/usr/local/Cellar/apache-spark/2.4.3/libexec//python/pyspark/shell.py", line 38, in <module>
    SparkContext._ensure_initialized()
  File "/usr/local/Cellar/apache-spark/2.4.3/libexec/python/pyspark/context.py", line 316, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway(conf)
  File "/usr/local/Cellar/apache-spark/2.4.3/libexec/python/pyspark/java_gateway.py", line 46, in launch_gateway
    return _launch_gateway(conf)
  File "/usr/local/Cellar/apache-spark/2.4.3/libexec/python/pyspark/java_gateway.py", line 139, in _launch_gateway
    java_import(gateway.jvm, "org.apache.spark.SparkConf")
  File "/Library/Python/2.7/site-packages/py4j-0.10.4-py2.7.egg/py4j/java_gateway.py", line 175, in java_import
    return_value = get_return_value(answer, gateway_client, None, None)
  File "/Library/Python/2.7/site-packages/py4j-0.10.4-py2.7.egg/py4j/protocol.py", line 323, in get_return_value
    format(target_id, ".", name, value))
py4j.protocol.Py4JError: An error occurred while calling None.None. Trace:
Authentication error: unexpected command.

Everything with Spark itself seems to be fine, because when I run spark-submit --version I get:

Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.3
      /_/
Using Scala version 2.11.12, Java HotSpot(TM) 64-Bit Server VM, 1.8.0_131
Branch 
Compiled by user  on 2019-05-01T05:08:38Z
Revision 
Url 
Type --help for more information.

Also, this is what my ~/.bash_profile looks like:

export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_131.jdk/Contents/Home
export PATH=$PATH:/Users/ahajibagheri/Library/Python/2.7/bin
if which pyspark > /dev/null; then
    export SPARK_HOME="/usr/local/Cellar/apache-spark/2.4.3/libexec/"
    export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/build:$PYTHONPATH
    export PYTHONPATH=$SPARK_HOME/python/lib/py4j-0.10.4-src.zip:$PYTHONPATH
fi
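
Note: the PYTHONPATH above hard-codes py4j-0.10.4-src.zip, while Spark 2.4.x ships its own py4j source zip under $SPARK_HOME/python/lib (a newer 0.10.7 for this release). A minimal sketch of a version-agnostic variant of the same block, assuming the Homebrew layout shown above:

# Sketch: pick up whichever py4j src zip ships with this Spark install,
# instead of hard-coding a py4j version that may not match the gateway.
if which pyspark > /dev/null; then
    export SPARK_HOME="/usr/local/Cellar/apache-spark/2.4.3/libexec"
    export PYTHONPATH="$SPARK_HOME/python:$PYTHONPATH"
    for py4j_zip in "$SPARK_HOME"/python/lib/py4j-*-src.zip; do
        export PYTHONPATH="$py4j_zip:$PYTHONPATH"
    done
fi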

Select py4j.0.10.8.1.zip and pyspark.zip from python/lib in Settings / Project Structure / Add Content Root, pointing at the Spark folder you downloaded.
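
To confirm which py4j and pyspark the interpreter actually picks up after these changes, a quick check (a sketch, not part of the original answer) is to print where each module is loaded from:

# Both modules should resolve to the Spark distribution's zips/paths,
# not to an older egg in site-packages.
python -c "import py4j, pyspark; print(py4j.__file__); print(pyspark.__file__)"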

Latest update