Spark submit multiple configurations


The docs at https://spark.apache.org/docs/1.6.1/running-on-yarn.html say to pass configuration in key=value format.

Unfortunately they don't give a clear example. I want to set two configurations:

spark-submit --conf "spark.hadoop.parquet.enable.summary-metadata=false;spark.yarn.maxAppAttempts=1" etc..

Is this the correct way of doing it, and if not, what would be the correct way?

I am using Spark 1.6.

Thank you

4 REPLIES



I had one more question: if I need the arguments to be in quotes, then --conf "A" --conf "B" doesn't work. Just curious if you happen to know how to pass two quoted arguments to spark-submit.

Thanks for the answer :).

Contributor

I believe single quotes should work. Try --conf 'some.config' --conf 'other.config'.
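One way to check the quoting without running Spark at all: use a stand-in function (hypothetical, purely for illustration) that prints each argument on its own line, so you can see exactly what the shell would hand to spark-submit. Quotes keep a value containing spaces together as a single argument, while each --conf stays separate.

```shell
# Stand-in for spark-submit: prints each argument it receives on its own
# line, showing how the shell splits the command into arguments.
show_args() { printf '[%s]\n' "$@"; }

# Two separate --conf flags; the quoted value with spaces stays intact.
show_args --conf 'spark.yarn.maxAppAttempts=1' \
          --conf 'spark.driver.extraJavaOptions=-XX:+UseG1GC -Dkey=val'
# → [--conf]
# → [spark.yarn.maxAppAttempts=1]
# → [--conf]
# → [spark.driver.extraJavaOptions=-XX:+UseG1GC -Dkey=val]
```

Single or double quotes both work here; what matters is that each key=value pair is quoted on its own, not that both pairs share one quoted string.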

Cloudera Employee

Hi,

 

The correct way to pass multiple configurations is to pass each one with its own --conf flag.

Ex:

spark-submit --conf spark.hadoop.parquet.enable.summary-metadata=false --conf spark.driver.memory=10g

 

Thanks

AK
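Putting the thread together for the original question, a sketch of the full command, assuming Spark 1.6 on YARN: building the argument list as a bash array keeps every --conf as its own argument, even when values contain spaces (the app jar name below is a hypothetical placeholder).

```shell
# Build the spark-submit arguments as a bash array: each element is
# passed to the command as exactly one argument.
SPARK_ARGS=(
  --master yarn
  --conf spark.hadoop.parquet.enable.summary-metadata=false
  --conf spark.yarn.maxAppAttempts=1
)

# Show the argument list, one per line.
printf '%s\n' "${SPARK_ARGS[@]}"

# Actual submission (not run here) would look like:
#   spark-submit "${SPARK_ARGS[@]}" your-app.jar
```

The array form also makes it easy to add or remove configurations in scripts without worrying about line-continuation backslashes.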