New Contributor
Posts: 1
Registered: 06-20-2019

Reason: Container killed by YARN for exceeding memory limits. 56.3 GB of 31 GB physical memory used


I am getting a memory-limit-exceeded error while running a Spark job. I have tried a couple of things but still no luck. I want to understand why usage climbs to 56.3 GB before the container is killed. Any leads toward a solution would be really helpful.

 

Executor memory: 25 GB

spark.executor.memoryOverhead: 0.1 of executor memory (default)

Total nodes: 250

Cores per node: 32

Memory per node: 96 GB
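
As far as I understand, YARN sizes each executor container as spark.executor.memory plus the memory overhead (by default the larger of 384 MB or 10% of executor memory), then rounds the request up to a multiple of yarn.scheduler.minimum-allocation-mb. Here is the arithmetic I am using as a sanity check; the 1024 MB minimum allocation is my assumption and still needs to be confirmed against our actual YARN config:

    object ContainerSizeCheck {
      def main(args: Array[String]): Unit = {
        val executorMemoryMb = 25 * 1024                // spark.executor.memory = 25g
        // Default overhead rule: max(384 MB, 0.1 * executor memory)
        val overheadMb = math.max(384, (0.1 * executorMemoryMb).toInt)
        val minAllocationMb = 1024                      // assumed yarn.scheduler.minimum-allocation-mb

        val requestedMb = executorMemoryMb + overheadMb // 25600 + 2560 = 28160 MB
        // YARN rounds each container request up to a multiple of the minimum allocation.
        val containerMb =
          (math.ceil(requestedMb.toDouble / minAllocationMb) * minAllocationMb).toInt

        println(s"Requested: $requestedMb MB, container limit: $containerMb MB") // 28160 / 28672 MB
      }
    }

With these values I get roughly 28 GB, which does not match the 31 GB limit in the error message, so I suspect the effective overhead or off-heap settings on the cluster differ from the defaults listed above.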

 

 

AKR
Cloudera Employee
Posts: 31
Registered: 09-20-2018

Re: Reason: Container killed by YARN for exceeding memory limits. 56.3 GB of 31 GB physical memory used

Hi,

 

Could you please share the full error stack trace for further analysis? Also, have you tried reducing the executor memory on a trial basis?
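
For example, a trial run could build the session with a smaller heap and an explicit, larger overhead. The 20g / 4g values below are only placeholders to show the shape of the change, not tuned recommendations for your cluster (note that on Spark versions before 2.3 the property is spark.yarn.executor.memoryOverhead instead):

    import org.apache.spark.sql.SparkSession

    // Trial settings only: smaller heap plus an explicit, larger overhead.
    // 20g / 4g are placeholders, not tuned recommendations.
    val spark = SparkSession.builder()
      .appName("memory-trial")
      .config("spark.executor.memory", "20g")          // reduced from 25g
      .config("spark.executor.memoryOverhead", "4g")   // explicit instead of the 10% default
      .getOrCreate()

If the job still gets killed with a larger overhead, that would point toward native or off-heap memory growth rather than heap pressure.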

 

Thanks

AK