Support Questions

YARN used 2000% of memory

New Member

Hi,

We have a 5-node cluster currently running HDP 2.0. Recently we observed YARN reporting 2000% memory usage.

We currently allocate 2 GB of memory to YARN, but the metrics show 40 GB used by our current job. All nodes are still "alive". Will this be a problem? Should we increase the memory allocated to the YARN cluster?

(Attached screenshots: 2178-ambari-elephant2-1.png, 2179-ambari-elephant-1.png)
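For what it's worth, the 2000% figure is consistent with the numbers given in the question, if Ambari reports usage as a percentage of the configured allocation. A quick sanity check, using the values from the post:

```python
# Values taken from the question above.
allocated_gb = 2    # memory currently allocated to YARN
used_gb = 40        # memory the metrics show in use by the job

# Usage expressed as a percentage of the allocation.
percent = used_gb / allocated_gb * 100
print(percent)  # 2000.0
```

This suggests the alert reflects an undersized allocation setting rather than the cluster actually running out of physical memory.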

1 ACCEPTED SOLUTION

Master Mentor

@Jade Liu

You should look into upgrading your system; both HDP and Ambari need to be upgraded.

You can also increase the YARN memory allocation, then restart YARN and check whether the alert goes away.
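On HDP these settings are normally changed through the Ambari UI (YARN > Configs), but the underlying yarn-site.xml properties look roughly like this. The values below are illustrative assumptions, not recommendations; size them to your nodes' actual physical RAM:

```xml
<!-- yarn-site.xml: sketch of the properties that control YARN memory.
     Values are examples only; tune them to your hardware. -->
<property>
  <!-- Total memory (MB) a NodeManager may hand out to containers -->
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>40960</value>
</property>
<property>
  <!-- Largest container (MB) a single request may be granted -->
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>8192</value>
</property>
```

After changing these, restart the YARN services so the NodeManagers pick up the new limits.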

