
Hadoop not utilizing available memory

New Contributor

Hi,

I have a 2-node cluster (each node has 32 GB RAM and 8 cores). I have installed CDH 5.4.

But I don't think Hadoop is utilizing the available memory: the ResourceManager page at :8088/cluster/apps shows only 16 GB in the "Memory Total" column.

Only once did I see "Memory Total" report 64 GB, so I'm not sure what's going on.

What could be the reason?

Thanks,

Baahu

1 ACCEPTED SOLUTION

New Contributor

The yarn.nodemanager.resource.memory-mb parameter was set to 8 GB, which was causing the problem.

I reset it to a higher value, which resolved this behavior.
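For reference, this property is set in yarn-site.xml on each NodeManager (or through the YARN service configuration in Cloudera Manager on CDH, which regenerates the file for you). A minimal sketch, assuming you want each node to offer 24 GB of its 32 GB to containers; the 24576 MB value is an illustrative assumption, not a recommendation, and you should leave headroom for the OS, the NodeManager itself, and any other services on the host:

```xml
<!-- yarn-site.xml (on each NodeManager host) -->
<property>
  <!-- Total physical memory, in MB, that this NodeManager offers
       to YARN containers. 24576 MB = 24 GB (assumed example value). -->
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>24576</value>
</property>
```

This also explains the numbers in the original question: with two NodeManagers each advertising 8192 MB, the ResourceManager shows 2 × 8 GB = 16 GB in "Memory Total". Raising the per-node value raises the cluster total accordingly, and the NodeManagers must be restarted for the change to take effect.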


2 REPLIES

Community Manager

Congratulations on solving your issue. Thank you for sharing the solution as it may also help others. 🙂


Cy Jervis, Manager, Community Program