My Spark job is utilising all vcores configured for the YARN queue: new jobs failing

New Contributor

My YARN queue is configured with 150 total vcores and 850 GB of memory. My Spark job is utilising all 150 available vcores but only about a third of the total memory, so its executors must be requesting far more vcores relative to memory than the queue's overall 150:850 ratio. When new jobs are started, they fail with the message "unable to allocate yarn resources".

How can I reduce the vcore allocation for my Spark job? A sketch of the kind of cap I have in mind is below.
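Something like the following (the numbers and app name are hypothetical, not my actual settings), where spark.executor.cores caps the vcores per executor and spark.dynamicAllocation.maxExecutors caps the executor count, so the job's total vcore usage is roughly their product plus the driver:

```python
from pyspark.sql import SparkSession

# Hypothetical PySpark session capping the job's vcore footprint.
spark = (
    SparkSession.builder
    .appName("vcore-capped-job")
    # request 2 vcores per executor instead of a larger value
    .config("spark.executor.cores", "2")
    # cap how many executors YARN may grant under dynamic allocation
    .config("spark.dynamicAllocation.enabled", "true")
    .config("spark.dynamicAllocation.maxExecutors", "40")
    # the external shuffle service is needed for dynamic allocation on YARN
    .config("spark.shuffle.service.enabled", "true")
    # 2 vcores/executor * 40 executors = 80 vcores, leaving ~70 for other jobs
    .getOrCreate()
)
```

If dynamic allocation is disabled, I assume spark.executor.instances (or --num-executors on spark-submit) would be the equivalent cap. Is this the right approach, or should this be handled at the queue level instead?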
