Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.

YARN used 2000% of memory

New Member

Hi,

We have a 5-node cluster currently running HDP 2.0. Recently we observed that YARN is reporting 2000% memory usage.

We currently allocate 2 GB of memory to YARN, but the metrics show 40 GB in use for our current job (hence the 2000% figure). All nodes are still "alive". Is this a problem? Should we increase the memory allocated to the YARN cluster?

[Screenshot attachments: 2178-ambari-elephant2-1.png, 2179-ambari-elephant-1.png]

1 ACCEPTED SOLUTION

Master Mentor

@Jade Liu

You should look into upgrading your system; both HDP and Ambari need to be upgraded.

You can also increase the memory allocated to YARN, then restart YARN to see whether the alert goes away.
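For reference, the memory available to YARN containers is controlled in yarn-site.xml (in Ambari, under YARN > Configs). A minimal sketch with illustrative values only, not tuned recommendations for this cluster:

```xml
<!-- yarn-site.xml: illustrative values, adjust to your nodes' physical RAM -->
<property>
  <!-- Total memory on each NodeManager that YARN may hand out to containers -->
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>8192</value>
</property>
<property>
  <!-- Largest single container the scheduler will allocate -->
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>8192</value>
</property>
<property>
  <!-- Smallest container increment the scheduler will allocate -->
  <name>yarn.scheduler.minimum-allocation-mb</name>
  <value>1024</value>
</property>
```

After changing these values, restart the YARN services (ResourceManager and NodeManagers) for them to take effect; Ambari will prompt for the restart automatically.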

