Created 05-26-2017 02:34 PM
The docs here seem to say to pass it in key=value format: https://spark.apache.org/docs/1.6.1/running-on-yarn.html
But unfortunately they don't give a clear example. I want to have 2 configurations set, e.g.:
spark-submit --conf "spark.hadoop.parquet.enable.summary-metadata=false;spark.yarn.maxAppAttempts=1" etc..
Is this the correct way of doing it, and if not, what would be the correct way?
I am using Spark 1.6.
Thank you
Created 05-26-2017 04:45 PM
The correct way to pass multiple configuration options is to specify each one with its own --conf flag. The following should work for your example:
spark-submit --conf spark.hadoop.parquet.enable.summary-metadata=false --conf spark.yarn.maxAppAttempts=1
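When there are many settings, spark-submit can also read them from a file via its --properties-file option, which uses the same format as spark-defaults.conf. A minimal sketch (the file name my-app.conf and app.jar are placeholders):

```shell
# my-app.conf contains one setting per line, whitespace-separated:
#   spark.hadoop.parquet.enable.summary-metadata  false
#   spark.yarn.maxAppAttempts                     1
spark-submit --properties-file my-app.conf app.jar
```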
As always, if you like the answer please upvote it.
Created 05-26-2017 04:50 PM
I had one more question: if the arguments need to be in quotes, then --conf "A" --conf "B" doesn't work for me. Just curious if you happen to know how to pass two quoted arguments to spark-submit.
Thanks for the answer :).
Created 05-26-2017 04:56 PM
I believe single quotes should work. Try --conf 'some.config' --conf 'other.config'.
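For reference, here is the quoting pattern applied to the two settings from the question; a sketch assuming a Bourne-compatible shell, with app.jar as a placeholder for your application:

```shell
# One --conf flag per setting; single (or double) quotes protect each
# key=value pair from shell interpretation.
spark-submit \
  --conf 'spark.hadoop.parquet.enable.summary-metadata=false' \
  --conf 'spark.yarn.maxAppAttempts=1' \
  app.jar
```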
Created 12-09-2019 11:32 PM
Hi,
The correct way to pass multiple configurations is to pass each of them with its own --conf flag.
Ex:
spark-submit --conf org.spark.metadata=false --conf spark.driver.memory=10g
Thanks
AK