Loading a properties file onto the Spark classpath during spark-submit



I am setting up the Spark Atlas Connector (https://github.com/hortonworks-spark/spark-atlas-connector) in a spark-submit script. Due to security restrictions, I cannot place atlas-application.properties in the spark/conf directory.

I used two options with spark-submit:

--driver-class-path  "spark.driver.extraClassPath=hdfs:///directory_to_properties_files" 
--conf "spark.executor.extraClassPath=hdfs:///directory_to_properties_files" 

When I launch spark-submit, I run into this:

20/07/20 11:32:50 INFO ApplicationProperties: Looking for atlas-application.properties in classpath
20/07/20 11:32:50 INFO ApplicationProperties: Looking for /atlas-application.properties in classpath
20/07/20 11:32:50 INFO ApplicationProperties: Loading atlas-application.properties from null

Please see the CDP Atlas Configuration article:

https://community.cloudera.com/t5/Community-Articles/How-to-pass-atlas-application-properties-configuration-file/ta-p/322158
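For context on why the original attempt fails: `spark.driver.extraClassPath` and `spark.executor.extraClassPath` become plain JVM classpath entries, and the JVM classloader only understands local filesystem paths, not `hdfs://` URLs, so the properties file is never found. The `-Datlas.conf` approach below instead points Atlas at a directory that contains the file. The lookup order it relies on can be sketched roughly like this (an illustration with an assumed search order, not Atlas's actual code):

```python
import os

def find_atlas_properties(conf_dir=None, classpath_dirs=()):
    """Illustrative lookup order (assumed, not Atlas's real implementation):
    1. the directory given via -Datlas.conf (conf_dir here),
    2. each directory on the JVM classpath."""
    candidates = []
    if conf_dir:
        candidates.append(os.path.join(conf_dir, "atlas-application.properties"))
    for d in classpath_dirs:
        candidates.append(os.path.join(d, "atlas-application.properties"))
    for path in candidates:
        if os.path.isfile(path):
            return path
    # Nothing found anywhere -> mirrors the "Loading ... from null" log line
    return None
```

If neither the `-Datlas.conf` directory nor any classpath directory contains the file, the lookup comes back empty, which matches the `Loading atlas-application.properties from null` message in the log above.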

Client mode (the driver runs on the submitting host, so the directory passed via -Datlas.conf, here /tmp/, must contain atlas-application.properties locally on that machine):

spark-submit --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode client --driver-java-options="-Datlas.conf=/tmp/" /opt/cloudera/parcels/CDH/jars/spark-examples*.jar 10

Cluster mode:

sudo -u spark spark-submit --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode cluster --files /tmp/atlas-application.properties --conf spark.driver.extraJavaOptions="-Datlas.conf=./" /opt/cloudera/parcels/CDH/jars/spark-examples*.jar 10
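The cluster-mode variant works because `--files` asks YARN to localize atlas-application.properties into each container's working directory, after which the relative `-Datlas.conf=./` resolves to a directory that really contains the file. A minimal sketch of that mechanism, with a temporary directory standing in for the YARN container (assumed behavior, for illustration only):

```python
import os
import shutil
import tempfile

def localize_and_resolve(src_file, conf_dir="./"):
    """Sketch of what --files localization achieves (assumed behavior):
    the file is copied into the container's working directory, and the
    JVM is started with that directory as its cwd, so a relative
    -Datlas.conf=./ points at it."""
    workdir = tempfile.mkdtemp()      # stand-in for the YARN container dir
    shutil.copy(src_file, workdir)    # what --files does for each container
    old_cwd = os.getcwd()
    os.chdir(workdir)                 # the container JVM starts here
    try:
        path = os.path.join(conf_dir, "atlas-application.properties")
        return os.path.isfile(path)
    finally:
        os.chdir(old_cwd)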
