New Contributor
Posts: 3
Registered: ‎07-11-2015
Accepted Solution

Hadoop not utilizing available memory

Hi,

I have a 2 node cluster (each node having 32 GB RAM and 8 Cores). I have installed CDH 5.4.

But I don't think Hadoop is using all of the available memory: the ResourceManager web UI at :8088/cluster/apps shows only 16 GB in the "Memory Total" column.

Only once did I see "Memory Total" show the expected 64 GB (2 nodes × 32 GB); I'm not sure what's going on.

What could be the reason?

 

Thanks,

Baahu


Re: Hadoop not utilizing available memory

The yarn.nodemanager.resource.memory-mb parameter was set to 8 GB, so the two NodeManagers together advertised only 16 GB to YARN, which matches what the "Memory Total" column was showing.

I reset it to a higher value, which resolved the behavior.
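For anyone hitting the same symptom, a minimal sketch of the fix in yarn-site.xml. The 26 GB value below is an example, not taken from the original post — the right number depends on how much headroom the OS and other daemons on each node need:

```xml
<!-- yarn-site.xml on each NodeManager host -->
<property>
  <name>yarn.nodemanager.resource.memory-mb</name>
  <!-- Example value (26 GB of the 32 GB per node); this figure is an
       assumption -- leave headroom for the OS and other services. -->
  <value>26624</value>
</property>
```

After changing this (in Cloudera Manager it is the "Container Memory" NodeManager setting), restart the NodeManagers; "Memory Total" in the ResourceManager UI should then read this value multiplied by the number of live nodes.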

Posts: 1,039
Kudos: 129
Solutions: 62
Registered: ‎04-06-2015

Re: Hadoop not utilizing available memory

Congratulations on solving your issue. Thank you for sharing the solution as it may also help others. :)



Cy Jervis, Community Manager

