
Hadoop not utilizing available memory (SOLVED)

New Contributor

Hi,

I have a 2-node cluster (each node has 32 GB RAM and 8 cores) and have installed CDH 5.4.

But I don't think Hadoop is utilizing the available memory: the ResourceManager UI at :8088/cluster/apps shows only 16 GB in the "Memory Total" column.

Only once did I see "Memory Total" show 64 GB, so I'm not sure what's going on.

What could be the reason?

 

Thanks,

Baahu


2 REPLIES

Re: Hadoop not utilizing available memory (ACCEPTED SOLUTION)

New Contributor

The yarn.nodemanager.resource.memory-mb parameter was set to 8 GB (the Hadoop default), which was causing the problem: each NodeManager was offering only 8 GB to YARN, so the two-node cluster showed 2 × 8 GB = 16 GB as "Memory Total".

I reset it to a higher value, which resolved the behavior.
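
For anyone who lands here later: if you manage the configuration by hand rather than through Cloudera Manager, this property lives in yarn-site.xml on each NodeManager host. A minimal sketch, with example values only (24 GB leaves headroom for the OS and other daemons on a 32 GB node; pick numbers for your own hardware):

<!-- yarn-site.xml: resources each NodeManager offers to YARN containers -->
<property>
  <name>yarn.nodemanager.resource.memory-mb</name>
  <!-- example: 24 GB of a 32 GB node; the default is 8192 MB -->
  <value>24576</value>
</property>
<property>
  <name>yarn.nodemanager.resource.cpu-vcores</name>
  <!-- example: all 8 cores of the node -->
  <value>8</value>
</property>

Restart the NodeManagers after changing this, and the new total should appear under "Memory Total" on the :8088 page. You can also cross-check what each node is offering from the CLI: yarn node -list to get the node IDs, then yarn node -status <node-id>.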


Re: Hadoop not utilizing available memory

Community Manager

Congratulations on solving your issue. Thank you for sharing the solution as it may also help others. :)



Cy Jervis, Community Manager
