Member since: 11-07-2024
Posts: 3
Kudos Received: 2
Solutions: 0
12-18-2024 05:29 PM
@Bharati Any updates? I have tried the REST API method. Getting my app's expiry time works just fine, but the PUT request returns a 401 Unauthorized error. It's a shared cluster and I don't have admin-level authorization.
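For reference, a 401 usually means the PUT was not authenticated at all (for example, no SPNEGO negotiation took place), whereas an authenticated caller without sufficient rights would more typically get a 403; as far as I understand, the update also has to come from the application's owner or a user permitted by the cluster ACLs. Below is a minimal sketch of the GET and PUT calls against the ResourceManager's application timeout API, assuming a SPNEGO-secured RM reachable through the requests-kerberos package; RM_HOST and APP_ID are placeholders, not real values from this cluster:

```python
# Sketch: read and update an app's LIFETIME timeout via the RM REST API.
# Assumptions: Kerberized cluster, requests-kerberos installed, and the
# caller is the app owner or ACL-permitted. RM_HOST/APP_ID are placeholders.
from datetime import datetime, timedelta, timezone

import requests
from requests_kerberos import HTTPKerberosAuth, OPTIONAL

RM_HOST = "http://resourcemanager.example.com:8088"  # placeholder RM web address
APP_ID = "application_1700000000000_0001"            # placeholder application id
auth = HTTPKerberosAuth(mutual_authentication=OPTIONAL)

# Reading the current expiry works even without modify rights.
r = requests.get(f"{RM_HOST}/ws/v1/cluster/apps/{APP_ID}/timeouts", auth=auth)
r.raise_for_status()
print(r.json())

# Push the expiry out by 24 hours; the RM expects ISO-8601 with an offset,
# e.g. 2024-12-18T17:29:00.104+0000.
new_expiry = (datetime.now(timezone.utc) + timedelta(hours=24)
              ).strftime("%Y-%m-%dT%H:%M:%S.%f")[:-3] + "+0000"
r = requests.put(
    f"{RM_HOST}/ws/v1/cluster/apps/{APP_ID}/timeout",
    json={"timeout": {"type": "LIFETIME", "expiryTime": new_expiry}},
    auth=auth,
)
print(r.status_code, r.text)
```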
12-12-2024 06:25 PM
1 Kudo
The admin has only set yarn.scheduler.capacity.root.<queue-path>.default-application-lifetime, to a value of 86400, which should be soft enforcement in the sense that an application can declare its own lifetime. yarn.scheduler.capacity.<queue-path>.maximum-application-lifetime is not set. My question is how to configure my PySpark application to bypass this soft enforcement.
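In capacity-scheduler terms, default-application-lifetime only applies when the submission carries no LIFETIME timeout of its own, and with maximum-application-lifetime unset there is no hard cap. Spark does not seem to expose YARN's ApplicationTimeouts through any configuration key, so one workaround sketch is to have the PySpark job look up its own application id and push the LIFETIME expiry past the 86400-second default via the ResourceManager REST API. Assumptions here: RM_HOST is a placeholder, and the cluster uses simple auth so the RM honors the user.name query parameter (a Kerberized cluster needs SPNEGO instead, as in the sketch above):

```python
# Sketch: from inside the running PySpark job, extend this application's
# LIFETIME timeout beyond the queue's default. RM_HOST is a placeholder;
# simple-auth cluster assumed (user.name query parameter).
from datetime import datetime, timedelta, timezone
import getpass

import requests
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("long-running-job").getOrCreate()
app_id = spark.sparkContext.applicationId  # e.g. application_1700000000000_0001

RM_HOST = "http://resourcemanager.example.com:8088"  # placeholder
new_expiry = (datetime.now(timezone.utc) + timedelta(days=7)
              ).strftime("%Y-%m-%dT%H:%M:%S.%f")[:-3] + "+0000"

resp = requests.put(
    f"{RM_HOST}/ws/v1/cluster/apps/{app_id}/timeout",
    params={"user.name": getpass.getuser()},
    json={"timeout": {"type": "LIFETIME", "expiryTime": new_expiry}},
)
print(resp.status_code, resp.text)
```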
12-11-2024 01:46 AM
1 Kudo
My PySpark yarn-client application got killed by the cluster because of this setting: yarn.scheduler.capacity.root.default-application-lifetime. What config should I use to declare my application's lifetime and avoid getting killed?
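For diagnosis, the ResourceManager's application timeout API reports the expiry YARN attached to the app at submission, which shows how long the queue's default leaves it before the RM kills it. A minimal sketch, with RM_HOST and APP_ID as placeholders:

```python
# Sketch: inspect the LIFETIME timeout YARN assigned to the application.
# RM_HOST and APP_ID are placeholders for the real cluster values.
import requests

RM_HOST = "http://resourcemanager.example.com:8088"  # placeholder
APP_ID = "application_1700000000000_0001"            # placeholder

r = requests.get(f"{RM_HOST}/ws/v1/cluster/apps/{APP_ID}/timeouts")
r.raise_for_status()
for t in r.json()["timeouts"]["timeout"]:
    # Each entry carries a type (e.g. LIFETIME), an expiryTime, and
    # remainingTimeInSeconds (-1 means UNLIMITED).
    print(t["type"], t.get("expiryTime"), t.get("remainingTimeInSeconds"))
```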
Labels:
- Apache Spark