How to install Spark as a daemon



I followed this guide to start Spark on two machines, one as master and one as slave:
https://www.tutorialkart.com/apache-spark/how-to-setup-an-apache-spark-cluster/
Then I created a systemd .service file for each machine, but when I start them as services they fail to start. Here is my systemctl status output:

● sparkslave.service - Spark Slave
   Loaded: loaded (/etc/systemd/system/sparkslave.service; enabled; vendor preset: enabled)
   Active: inactive (dead) since Mon 2019-12-09 07:30:22 EST; 55s ago
  Process: 31680 ExecStart=/usr/lib/spark/sbin/start-slave.sh spark://172.16.3.90:7077 (code=exited, status=0/SUCCESS)
 Main PID: 31680 (code=exited, status=0/SUCCESS)

Dec 09 07:30:19 SparkSlave1 systemd[1]: Started Spark Slave.
Dec 09 07:30:19 SparkSlave1 start-slave.sh[31680]: starting org.apache.spark.deploy.worker.Worker, logging to /usr/lib/spark/logs/spark-spark-user-org.apache.spark.deploy.worker.Worker-1-SparkSlave1.out

Here is my sparkslave.service:

[Unit]
Description=Spark Slave
After=network.target
[Service]
User=spark-user
WorkingDirectory=/usr/lib/spark/sbin
ExecStart=/usr/lib/spark/sbin/start-slave.sh spark://172.16.3.90:7077
Restart=on-failure
RestartSec=10s
[Install]
WantedBy=multi-user.target

Where is the problem?

The service type has to be changed from simple to forking. start-slave.sh launches the Worker as a background daemon and exits immediately, so with the default Type=simple systemd treats the service as finished as soon as the script returns (which is why the status shows inactive (dead) even though the exit code is 0/SUCCESS). With Type=forking, systemd expects the start script to fork and keeps tracking the daemon it leaves behind:

[Service]
Type=forking
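
For reference, a full unit reflecting the fix might look like the sketch below. The paths, user, and master URL are simply copied from the question, so adjust them to your own setup:

[Unit]
Description=Spark Slave
After=network.target

[Service]
# start-slave.sh forks the Worker into the background and exits,
# so tell systemd to expect a forking start script.
Type=forking
User=spark-user
WorkingDirectory=/usr/lib/spark/sbin
ExecStart=/usr/lib/spark/sbin/start-slave.sh spark://172.16.3.90:7077
Restart=on-failure
RestartSec=10s

[Install]
WantedBy=multi-user.target

After editing the unit, run systemctl daemon-reload and restart the service; systemctl status sparkslave should then report Active: active (running) with the Worker as the main process.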
