How do I run Spark 2.2 on YARN with HDP?



I am trying to run Spark 2.2 with HDP 2.6. I stopped Spark2 from Ambari and then ran:

/spark/bin/spark-shell \
  --jars /home/ed/.ivy2/jars/stanford-corenlp-3.6.0-models.jar,/home/ed/.ivy2/jars/jersey-bundle-1.19.1.jar \
  --packages databricks:spark-corenlp:0.2.0-s_2.11,edu.stanford.nlp:stanford-corenlp:3.6.0 \
  --master yarn --deploy-mode client \
  --driver-memory 4g --executor-memory 4g --executor-cores 2 --num-executors 11 \
  --conf spark.hadoop.yarn.timeline-service.enabled=false

It used to run fine, but then it started giving me:

17/12/09 10:16:54 ERROR SparkContext: Error initializing SparkContext. org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.

It runs fine without --master yarn --deploy-mode client, but then I only get the driver acting as the executor.

I tried spark.hadoop.yarn.timeline-service.enabled = true.

I also set yarn.nodemanager.vmem-check-enabled and yarn.nodemanager.pmem-check-enabled to false.
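For reference, the memory-check switches live in yarn-site.xml; this is roughly what I used (property names are the standard Hadoop NodeManager ones, values are what I tried):

```xml
<!-- yarn-site.xml: disable YARN's virtual and physical memory checks
     so containers are not killed for exceeding the memory ratios -->
<property>
  <name>yarn.nodemanager.vmem-check-enabled</name>
  <value>false</value>
</property>
<property>
  <name>yarn.nodemanager.pmem-check-enabled</name>
  <value>false</value>
</property>
```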

Can anyone help me figure out where to look for the error? TIA!
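Since the SparkContext dies before much is logged on the client side, the real cause is usually in the YARN application master's logs. Assuming log aggregation is on, something like this (standard YARN CLI commands; the application id is a placeholder) should pull them:

```
# List recent failed/killed applications to find the id of the bad attempt
yarn application -list -appStates FAILED,KILLED

# Fetch the aggregated logs for that attempt; the AM's stderr
# usually shows why the application master could not launch
yarn logs -applicationId application_XXXXXXXXXXXXX_XXXX
```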

PS, my spark-defaults.conf:

spark.driver.extraLibraryPath /usr/hdp/current/hadoop-client/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64
spark.eventLog.dir hdfs:///spark2-history/
spark.eventLog.enabled true
spark.executor.extraLibraryPath /usr/hdp/current/hadoop-client/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64
spark.history.fs.logDirectory hdfs:///spark2-history/
spark.history.kerberos.keytab none
spark.history.kerberos.principal none
spark.history.provider org.apache.spark.deploy.history.FsHistoryProvider
spark.history.ui.port 18081
spark.yarn.historyServer.address master.royble.co.uk:18081
spark.yarn.queue default
spark.yarn.jar=hdfs:///master.royble.co.uk/user/hdfs/sparklib/*.jar
spark.driver.extraJavaOptions -Dhdp.version=2.6.0.3-8
spark.executor.extraJavaOptions -Dhdp.version=2.6.0.3-8
spark.yarn.am.extraJavaOptions -Dhdp.version=2.6.0.3-8

I also tried the -Dhdp.version= fix from here.

UPDATE: I upgraded to HDP 2.6.3 and it now works.
