Created 02-20-2019 12:03 PM
When I submit a job and it fails, it automatically runs a second attempt. How can I disable that second run?
Which configuration parameter do I need to set?
Created 02-20-2019 12:07 PM
Do you want to control the number of attempts? If yes, then you might be interested in the property "spark.yarn.maxAppAttempts".
Example:
--conf spark.yarn.maxAppAttempts=1
https://spark.apache.org/docs/latest/running-on-yarn.html
spark.yarn.maxAppAttempts (defaults to yarn.resourcemanager.am.max-attempts in YARN): The maximum number of attempts that will be made to submit the application. It should be no larger than the global number of max attempts in the YARN configuration.
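Note that in cluster mode this property has to be in place when the application is submitted, so the --conf flag shown above is the reliable way to set it. If you launch the application in client mode (where creating the session is what submits the YARN application), it can also be set programmatically. A minimal PySpark sketch, with a placeholder app name and job logic:

```python
from pyspark.sql import SparkSession

# Limit YARN to a single application attempt so a failed job is not retried.
# Only effective here in client mode; in cluster mode pass it via spark-submit --conf.
spark = (
    SparkSession.builder
    .appName("single-attempt-job")            # hypothetical app name
    .config("spark.yarn.maxAppAttempts", "1")
    .getOrCreate()
)

# ... your job logic here ...

spark.stop()
```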
Created 02-20-2019 12:15 PM
I want to disable the second attempt.
Created 02-20-2019 12:18 PM
While submitting your Spark job, can you try passing the following:
# spark-submit --master yarn --deploy-mode cluster --conf spark.yarn.maxAppAttempts=1 ...
Or try setting "yarn.resourcemanager.am.max-attempts" to 1 (the default may be 2) in Ambari UI --> YARN --> Configs --> Advanced --> Advanced yarn-site.
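If the cluster is not managed through Ambari, the equivalent change would go into yarn-site.xml on the ResourceManager host (a sketch of the property entry; the ResourceManager needs a restart for it to take effect):

```xml
<!-- yarn-site.xml: cap the global number of ApplicationMaster attempts at 1 -->
<property>
  <name>yarn.resourcemanager.am.max-attempts</name>
  <value>1</value>
</property>
```

Keep in mind this is a cluster-wide limit that affects every YARN application, so setting spark.yarn.maxAppAttempts per job is usually the less intrusive option.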
Created 02-20-2019 01:26 PM
I think that did what I needed. Thanks!