Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant
Announcements
This board is archived and read-only for historical reference. To ask a new question, please post a new topic on the appropriate active board.

Spark submit multiple configurations

Not applicable

The docs here seem to say to pass them in key=value format: https://spark.apache.org/docs/1.6.1/running-on-yarn.html

Unfortunately they don't give a clear example. I want to set two configurations, like this:

spark-submit --conf "spark.hadoop.parquet.enable.summary-metadata=false;spark.yarn.maxAppAttempts=1" etc..

Is this the correct way of doing it, and if not, what would be the correct way?

I am using Spark 1.6.

Thank you

1 ACCEPTED SOLUTION

New Member
@elliot gimple

The correct way to pass multiple configuration options is to specify each one with its own --conf flag. The following should work for your example:

spark-submit --conf spark.hadoop.parquet.enable.summary-metadata=false --conf spark.yarn.maxAppAttempts=1

As always, if you like the answer, please upvote it.
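To make the "one key=value pair per --conf flag" rule concrete, here is a small sketch of how a launcher like spark-submit collects repeated --conf flags into its configuration. The parse_conf function and the sample values are purely illustrative, not spark-submit's actual implementation:

```shell
#!/bin/sh
# Sketch: gather every key=value passed via a repeated --conf flag,
# the way spark-submit accumulates them into its SparkConf.
parse_conf() {
  confs=""
  while [ "$#" -gt 0 ]; do
    case "$1" in
      --conf)
        confs="$confs $2"   # each --conf carries exactly one key=value pair
        shift 2
        ;;
      *)
        shift               # ignore anything else (app jar, class, etc.)
        ;;
    esac
  done
  # unquoted expansion collapses the extra whitespace
  echo $confs
}

parse_conf --conf spark.hadoop.parquet.enable.summary-metadata=false \
           --conf spark.yarn.maxAppAttempts=1
# prints: spark.hadoop.parquet.enable.summary-metadata=false spark.yarn.maxAppAttempts=1
```

Note that a semicolon-separated list inside a single --conf (as in the original question) would arrive as one malformed key=value pair, which is why the options must be split across separate flags.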


4 REPLIES

New Member
@elliot gimple

The correct way to pass multiple configuration options is to specify each one with its own --conf flag. The following should work for your example:

spark-submit --conf spark.hadoop.parquet.enable.summary-metadata=false --conf spark.yarn.maxAppAttempts=1

As always, if you like the answer, please upvote it.

Not applicable

One more question: if an argument needs to be in quotes, then --conf "A" --conf "B" doesn't work for me. Do you happen to know how to pass two quoted arguments to spark-submit?

Thanks for the answer :).

New Member

I believe single quotes should work. Try --conf 'some.config' --conf 'other.config'.
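The reason quoting matters is that the shell, not spark-submit, decides where one argument ends and the next begins. A quick sketch that makes the argument boundaries visible (show_args and the sample option values are illustrative; spark.driver.extraJavaOptions is a real Spark property, but the JVM flags here are just examples):

```shell
#!/bin/sh
# Print each argument on its own line, bracketed, so you can see
# exactly what the shell would hand to spark-submit.
show_args() {
  for a in "$@"; do
    printf '[%s]\n' "$a"
  done
}

# Quoting the whole key=value pair keeps a value with spaces
# together as a single argument.
show_args --conf 'spark.driver.extraJavaOptions=-XX:+UseG1GC -verbose:gc' \
          --conf spark.yarn.maxAppAttempts=1
```

Without the single quotes around the first pair, the shell would split it at the space and spark-submit would see "-verbose:gc" as a separate, meaningless argument.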

Cloudera Employee

Hi,


The correct way to pass multiple configurations is to pass each one with its own --conf flag.

Ex:

spark-submit --conf org.spark.metadata=false --conf spark.driver.memory=10g


Thanks

AK