Reason: Container killed by YARN for exceeding memory limits. 56.3 GB of 31 GB physical memory used

New Contributor

I am getting a memory-limit-exceeded error while running a Spark job. I have tried a couple of things but still no luck. I want to understand why it climbs to 56.3 GB and then fails. Any leads on a solution would be really helpful.

 

Executor memory: 25 GB

spark.executor.memoryOverhead = 0.1 (default)

Total nodes: 250

32 cores per node

96 GB memory per node
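As a rough check on the numbers above: YARN sizes each executor container as executor memory plus overhead, and Spark's documented default overhead is max(384 MB, 10% of the executor memory). A minimal sketch using the values from this post (the 56.3 GB interpretation at the end is a common cause, not a diagnosis):

```python
# Rough YARN container sizing for a Spark executor, using the values
# from the question above. Overhead formula is Spark's documented
# default: max(384 MB, 0.10 * executor memory).
executor_memory_mb = 25 * 1024                        # --executor-memory 25g
overhead_mb = max(384, int(0.10 * executor_memory_mb))

container_mb = executor_memory_mb + overhead_mb
print(f"requested container size: {container_mb / 1024:.1f} GB")  # 27.5 GB

# Actual usage far beyond the request (e.g. 56.3 GB) usually points at
# off-heap allocations -- PySpark worker processes, native libraries,
# direct buffers -- which is what the overhead setting is meant to cover.
```

This is why bumping `spark.executor.memoryOverhead` (rather than executor memory alone) is often the first knob to try for this particular error.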


1 REPLY

Re: Reason: Container killed by YARN for exceeding memory limits. 56.3 GB of 31 GB physical memory used

Cloudera Employee

Hi,

 

Could you please share the full error stack trace for further analysis? Also, did you try reducing the executor memory as a trial?
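A sketch of what that trial run might look like. The specific sizes here (20 GB executor memory, 4 GB overhead) and the job name are illustrative assumptions to tune, not values from this thread:

```shell
# Illustrative spark-submit settings only -- sizes are assumptions,
# not a recommendation from the thread. "your_job.py" is a placeholder.
# (On Spark < 2.3 the property is spark.yarn.executor.memoryOverhead.)
spark-submit \
  --executor-memory 20g \
  --conf spark.executor.memoryOverhead=4096 \
  your_job.py
```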

 

Thanks

AK