10-21-2022
10:49 AM
My YARN queue is configured with 150 total vcores and 850 GB of memory. My Spark job is using all 150 available vcores, but only about one third of the total memory. When new jobs are submitted, they fail with the message "unable to allocate yarn resources". How can I reduce the vcore allocation for my Spark job?
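For reference, a sketch of the submit-time settings I believe control the vcore footprint; the executor counts, sizes, and jar name below are placeholders, not my actual values:

```shell
# Sketch: capping YARN vcore usage for a Spark job (values are placeholders).
# Total vcores requested is roughly num-executors * executor-cores,
# plus one vcore for the driver/ApplicationMaster in cluster mode.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 20 \
  --executor-cores 3 \
  --executor-memory 12g \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.maxExecutors=20 \
  my_job.jar
```

With these example numbers the job would ask for about 60 vcores (20 executors x 3 cores) instead of all 150, while `spark.dynamicAllocation.maxExecutors` keeps dynamic allocation from scaling back up to the queue limit.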
Labels:
- Apache Spark
- Apache YARN
- Cloudera Manager