spark.executor.extraJavaOptions ignored in spark-submit



I am a newbie trying to run a Spark job locally. This is the command I am trying to execute, but I get a warning stating that my executor options are being ignored because they are non-Spark configuration properties.

Error:

Warning: Ignoring non-spark config property: "spark.executor.extraJavaOptions=-javaagent:statsd-jvm-profiler-2.1.0-jar-with-dependencies.jar=server=localhost,port=8086,reporter=InfluxDBReporter,database=profiler,username=profiler,password=profiler,prefix=MyNamespace.MySparkApplication,tagMapping=namespace.application"

Command:

./bin/spark-submit --master local[2] --class org.apache.spark.examples.GroupByTest --conf "spark.executor.extraJavaOptions=-javaagent:statsd-jvm-profiler-2.1.0-jar-with-dependencies.jar=server=localhost,port=8086,reporter=InfluxDBReporter,database=profiler,username=profiler,password=profiler,prefix=MyNamespace.MySparkApplication,tagMapping=namespace.application" --jars /Users/shprin/statd/statsd-jvm-profiler-2.1.0-jar-with-dependencies.jar libexec/examples/jars/spark-examples_2.11-2.3.0.jar

Spark version: 2.0.3

Please let me know how to resolve this issue.

Thanks in advance.

I think the problem is the double quotes you are using to specify spark.executor.extraJavaOptions. It should be a single quote.

./bin/spark-submit --master local[2] --conf 'spark.executor.extraJavaOptions=-javaagent:statsd-jvm-profiler-2.1.0-jar-with-dependencies.jar=server=localhost,port=8086,reporter=InfluxDBReporter,database=profiler,username=profiler,password=profiler,prefix=MyNamespace.MySparkApplication,tagMapping=namespace.application' --jars /Users/shprin/statd/statsd-jvm-profiler-2.1.0-jar-with-dependencies.jar libexec/examples/jars/spark-examples_2.11-2.3.0.jar
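As a quick sanity check of why quoting matters here, the sketch below uses plain `printf` standing in for spark-submit (the `profiler.jar` path and options are made up for illustration): each bracketed line is one argv entry the program would actually receive.

```shell
#!/bin/sh
# A javaagent option string that contains a space.
opts="-javaagent:profiler.jar=server=localhost,port=8086 -verbose:gc"

# Unquoted: the shell word-splits on the space, so spark-submit would see
# the text after the space as a separate argument, not part of --conf.
printf '[%s]\n' --conf spark.executor.extraJavaOptions=$opts

# Quoted (single or double quotes both work for a literal like this):
# the whole property reaches spark-submit as a single argument.
printf '[%s]\n' --conf "spark.executor.extraJavaOptions=$opts"
```

If the value ever gets split this way, spark-submit sees a property fragment it does not recognize, which is exactly the kind of situation that produces the "Ignoring non-spark config property" warning.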

In addition to the answer above: if your parameter contains both spaces and single quotes (for instance a query parameter), you should enclose it in escaped double quotes \".

Example:

spark-submit --master yarn --deploy-mode cluster --conf "spark.driver.extraJavaOptions=-DfileFormat=PARQUET -Dquery=\"select * from bucket where code in ('A')\" -Dchunk=yes" spark-app.jar
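To see what the escaped quotes actually deliver, here is a minimal sketch (again with `printf` in place of spark-submit, and a shortened made-up query): inside an outer double-quoted string, each `\"` survives as a literal `"` in the single argument the program receives.

```shell
#!/bin/sh
# The \" sequences are consumed by the shell and become literal double quotes,
# so the -Dquery value keeps its spaces and single quotes intact.
printf '[%s]\n' "spark.driver.extraJavaOptions=-Dquery=\"select * from t where code in ('A')\" -Dchunk=yes"
```

The entire property, spaces and all, arrives as one argument; the embedded single quotes need no escaping because the outer quoting is done with double quotes.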
