
Hadoop not utilizing available memory

Solved


New Contributor

Hi,

I have a 2-node cluster (each node with 32 GB RAM and 8 cores) running CDH 5.4.

However, I don't think Hadoop is using the available memory: the ResourceManager UI at :8088/cluster/apps shows only 16 GB in the "Memory Total" column, even though the two nodes have 64 GB between them.

Once, I did see "Memory Total" report 64 GB, so I'm not sure what's going on.

What could be the reason?

 

Thanks,

Baahu

1 ACCEPTED SOLUTION


Re: Hadoop not utilizing available memory

New Contributor

The yarn.nodemanager.resource.memory-mb parameter was set to 8 GB, which was causing the problem: with each of the two NodeManagers offering only 8 GB to YARN, the cluster total came to 16 GB.

I reset it to a higher value, which resolved the issue.
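For reference, this property is set in yarn-site.xml (or via the equivalent NodeManager configuration field in Cloudera Manager) and takes a value in MB. A minimal sketch, assuming a 32 GB node where roughly 24 GB is given to YARN containers (the exact value is an illustrative choice, not from the original post):

```xml
<!-- yarn-site.xml: memory each NodeManager offers to YARN containers.
     24576 MB (24 GB) is an example value for a 32 GB node, leaving
     headroom for the OS, HDFS daemons, and other services. -->
<property>
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>24576</value>
</property>
```

After changing the value, restart the NodeManagers; with two nodes at 24 GB each, "Memory Total" on the :8088 page should then show 48 GB.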


2 REPLIES

Re: Hadoop not utilizing available memory

Community Manager

Congratulations on solving your issue. Thank you for sharing the solution, as it may also help others. :)


Cy Jervis, Community Manager
