Support Questions


Spark submit multiple configurations


The docs say to pass configuration in key=value format: https://spark.apache.org/docs/1.6.1/running-on-yarn.html

Unfortunately they don't give a clear example. I want to set 2 configurations:

spark-submit --conf "spark.hadoop.parquet.enable.summary-metadata=false;spark.yarn.maxAppAttempts=1" etc..

Is this the correct way of doing it, and if not, what would be the correct way?

I am using Spark 1.6.

Thank you

1 ACCEPTED SOLUTION

Contributor
@elliot gimple

The correct way to pass multiple configuration options is to specify each one individually. The following should work for your example:

spark-submit --conf spark.hadoop.parquet.enable.summary-metadata=false --conf spark.yarn.maxAppAttempts=1
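For completeness, a full invocation with an application jar might look like the sketch below. The class name com.example.MyApp and jar my-app.jar are placeholders, not from this thread, and the command is built into a variable and echoed so it can be inspected without a cluster:

```shell
#!/bin/sh
# Sketch: a full spark-submit with one --conf flag per setting.
# com.example.MyApp and my-app.jar are placeholders; on a real
# cluster, run the command itself instead of echoing it.
CMD="spark-submit \
  --master yarn \
  --conf spark.hadoop.parquet.enable.summary-metadata=false \
  --conf spark.yarn.maxAppAttempts=1 \
  --class com.example.MyApp \
  my-app.jar"
echo "$CMD"
```

Each --conf flag takes exactly one key=value pair, so repeating the flag is how you accumulate settings.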

As always, if you like the answer please upvote it.


4 REPLIES



I had one more question: if I need the arguments to be in quotes, then --conf "A" --conf "B" doesn't work for me. Just curious if you happen to know how to pass two quoted arguments to spark-submit.

Thanks for the answer :).

Contributor

I believe single quotes should work. Try --conf 'some.config' --conf 'other.config'.
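Quoting (single or double) mainly matters once a value contains spaces: the shell strips the quotes before spark-submit sees the argument, so each key=value pair must be quoted as a whole. A sketch of what the shell actually delivers; the extraJavaOptions value is illustrative, not from this thread:

```shell
#!/bin/sh
# Quote the whole key=value pair when the value contains spaces, so the
# shell passes it to spark-submit as a single argument.
# The extraJavaOptions value below is illustrative.
set -- --conf "spark.yarn.maxAppAttempts=1" \
       --conf "spark.executor.extraJavaOptions=-XX:+PrintGCDetails -XX:+PrintGCTimeStamps"

# Print each argument as spark-submit would receive it (quotes already
# stripped by the shell, spaces preserved inside the second value):
for arg in "$@"; do
  printf '[%s]\n' "$arg"
done
# spark-submit "$@" ...  would receive exactly these four arguments
```

Without the quotes around the second pair, the shell would split it into two arguments at the space and spark-submit would reject the dangling -XX:+PrintGCTimeStamps token.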

Cloudera Employee

Hi,


The correct way to pass multiple configurations is to pass each one with its own --conf flag.

Ex:

spark-submit --conf org.spark.metadata=false --conf spark.driver.memory=10g


Thanks

AK