Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.
Announcements
This board is archived and read-only for historical reference. To ask a new question, please post a new topic on the appropriate active board.

Hadoop not utilizing available memory

Frequent Visitor

Hi,

I have a 2-node cluster (each node with 32 GB RAM and 8 cores). I have installed CDH 5.4.

But I don't think the available memory is being utilized by Hadoop, as the page :8088/cluster/apps shows only 16 GB in the "Memory Total" column.

Only once did I see "Memory Total" as 64 GB; I'm not sure what's going on.

What could be the reason?

 

Thanks,

Baahu

1 ACCEPTED SOLUTION

Frequent Visitor

The yarn.nodemanager.resource.memory-mb parameter was set to 8 GB, which was causing the problem: with two NodeManagers contributing 8 GB each, the UI reported only 16 GB in "Memory Total".

I reset this to a higher value, which resolved the behavior.
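For reference, a minimal sketch of the fix, assuming the property is edited directly in yarn-site.xml (on a CDH cluster it is typically managed through Cloudera Manager as the NodeManager's "Container Memory" setting). The 26624 MB value below is an illustrative choice, not from the original post; it leaves headroom for the OS and Hadoop daemons on a 32 GB node:

```xml
<!-- yarn-site.xml on each NodeManager host -->
<property>
  <name>yarn.nodemanager.resource.memory-mb</name>
  <!-- Total RAM YARN may allocate to containers on this node.
       Illustrative value: ~26 GB of a 32 GB node, leaving headroom
       for the OS and other daemons. -->
  <value>26624</value>
</property>
```

After changing the value, restart the NodeManagers; "Memory Total" on :8088/cluster/apps should then show the sum across nodes (2 × 26 GB = 52 GB in this sketch).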


2 REPLIES


Community Manager

Congratulations on solving your issue. Thank you for sharing the solution as it may also help others. 🙂


Keep the questions coming,

Cy Jervis | Senior Manager, Knowledge Programs

if (helpful) { mark_as_solution(); } | if (appreciated) { give_kudos(); }