I am trying to run spark-shell from the Terminal on MacOS using ./Documents/spark/spark-3.0.0-bin-hadoop2.7/bin/spark-shell, and it starts fine, but I want to be able to launch it with just spark-shell.
I watched a 4-minute video showing how this is done, but it doesn't work for me.
I don't fully understand how ~/.bash_profile works, but here is what mine looks like:
# added by Anaconda3 5.3.1 installer
# >>> conda init >>>
# !! Contents within this block are managed by 'conda init' !!
__conda_setup="$(CONDA_REPORT_ERRORS=false '/Users/ajay/anaconda3/bin/conda' shell.bash hook 2> /dev/null)"
if [ $? -eq 0 ]; then
    eval "$__conda_setup"
else
    if [ -f "/Users/ajay/anaconda3/etc/profile.d/conda.sh" ]; then
        . "/Users/ajay/anaconda3/etc/profile.d/conda.sh"
        CONDA_CHANGEPS1=false conda activate base
    else
        export PATH="/Users/ajay/anaconda3/bin:$PATH"
    fi
fi
unset __conda_setup
# <<< conda init <<<
export SPARK_HOME=/Users/ajay/Documents/spark/spark-3.0.0-bin-hadoop2.7
export PATH=$PATH:$SPARK_HOME/bin
PATH="/usr/local/bin:/Users/ajay/anaconda3/bin:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin"
$PATH gives /usr/local/bin:/Users/ajay/anaconda3/bin:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin
What do I need to change in ~/.bash_profile to make spark-shell work?
Edit:
This is the message I get when I run ./Documents/spark/spark-3.0.0-bin-hadoop2.7/bin/spark-shell:
20/08/27 16:51:16 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://192.168.0.2:4040
Spark context available as 'sc' (master = local[*], app id = local-1598527288778).
Spark session available as 'spark'.
When I run just spark-shell, it shows:
-bash: spark-shell: command not found
export PATH=$PATH:$SPARK_HOME/bin
PATH="/usr/local/bin:/Users/ajay/anaconda3/bin:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin"
These two lines are in the wrong order, because you "add" the Spark installation to $PATH and then immediately overwrite $PATH.
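To see why the order matters, here is a minimal sketch you can run in any bash session (/tmp/demo is just a placeholder directory, not part of your setup):

export PATH=$PATH:/tmp/demo          # appends /tmp/demo to the current PATH
PATH="/usr/local/bin:/usr/bin:/bin"  # assigns a fixed list, discarding everything appended before
echo $PATH                           # prints only /usr/local/bin:/usr/bin:/bin -- /tmp/demo is gone

The second assignment never references $PATH on its right-hand side, so everything added earlier, including $SPARK_HOME/bin, is thrown away.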
You probably want something like this instead:
export PATH="$SPARK_HOME/bin:/Users/ajay/anaconda3/bin:/usr/local/bin:$PATH"
Don't forget that any change to .bash_profile, .profile, or .bashrc only takes effect in a new shell (unless you source it manually).
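As a quick check (a sketch, assuming the corrected export line above is now in your ~/.bash_profile):

source ~/.bash_profile   # reload the profile in the current shell
which spark-shell        # should print /Users/ajay/Documents/spark/spark-3.0.0-bin-hadoop2.7/bin/spark-shell
spark-shell              # should now start without the full path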