Spark/YARN: disable second attempt on failure
Labels:
- Apache Spark
- Apache YARN
Created 02-20-2019 12:03 PM
When a submitted job fails, YARN tries to run it again with another attempt. How can I disable the second run?
What configuration parameter must I set?
Created 02-20-2019 12:07 PM
Do you want to control the number of attempts? If yes, you might be interested in the property "spark.yarn.maxAppAttempts".
Example:
--conf spark.yarn.maxAppAttempts=1
https://spark.apache.org/docs/latest/running-on-yarn.html
Property: spark.yarn.maxAppAttempts
Default: yarn.resourcemanager.am.max-attempts in YARN
Meaning: The maximum number of attempts that will be made to submit the application. It should be no larger than the global number of max attempts in the YARN configuration.
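For example, a complete submit command could look like the sketch below; the class name and jar path are hypothetical placeholders, not taken from this thread:

# minimal sketch -- com.example.MyApp and the jar path are placeholders
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.maxAppAttempts=1 \
  --class com.example.MyApp \
  /path/to/my-app.jar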
Created 02-20-2019 12:15 PM
I want to disable the second attempt.
Created 02-20-2019 12:18 PM
While submitting your Spark job, can you try passing the following?
# spark-submit --master yarn --deploy-mode cluster --conf spark.yarn.maxAppAttempts=1 ...

Or try setting "yarn.resourcemanager.am.max-attempts" to 1 (the default may be 2) in Ambari UI --> YARN --> Configs --> Advanced --> Advanced yarn-site.
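For reference, the corresponding yarn-site.xml entry would be a property block like the sketch below; note that this is a cluster-wide setting that caps attempts for every YARN application, not just Spark jobs:

<!-- yarn-site.xml sketch: cap ApplicationMaster attempts cluster-wide -->
<property>
  <name>yarn.resourcemanager.am.max-attempts</name>
  <value>1</value>
</property>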
Created 02-20-2019 01:26 PM
I think that did what I needed. Thanks!
