Member since 07-11-2015
08-02-2015 03:40 AM
The yarn.nodemanager.resource.memory-mb parameter was set to 8 GB, which was causing the problem. Resetting it to a higher value resolved this behavior.
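For anyone hitting the same thing, a minimal yarn-site.xml sketch of the fix described above (the 24576 MB value is only an example; size it below the node's physical RAM to leave headroom for the OS and other daemons):

```xml
<!-- yarn-site.xml: total memory one NodeManager may hand out to containers.
     Example value only (24 GB on a 32 GB node); adjust for your hardware. -->
<property>
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>24576</value>
</property>
```

Restart the NodeManager after changing this so the ResourceManager picks up the new capacity.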
08-01-2015 09:13 PM
Hi, I have a 2-node cluster (each node has 32 GB RAM and 8 cores) running CDH 5.4. I don't think the available memory is being utilized by Hadoop, as the :8088/cluster/apps page shows only 16 GB in the "Memory Total" column. Only once did I see "Memory Total" as 64 GB, and I'm not sure what's going on. What could be the reason? Thanks, Baahu
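One plausible explanation (an assumption, since the thread doesn't show the config): YARN's "Memory Total" is the sum of yarn.nodemanager.resource.memory-mb across live NodeManagers, not the machines' physical RAM, and that property defaults to 8192 MB. A quick sanity check of the numbers:

```python
# "Memory Total" on :8088 sums each NodeManager's configured offering,
# NOT physical RAM. With the stock default on both nodes:
DEFAULT_NM_MEMORY_MB = 8192            # Hadoop default for the property

nodes = 2
memory_total_gb = nodes * DEFAULT_NM_MEMORY_MB / 1024
print(memory_total_gb)                 # 16.0 -- matches what the UI shows
```

That would also explain a one-off 64 GB reading: if the property were raised to 32 GB per node, 2 × 32 GB = 64 GB.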
07-11-2015 05:11 AM
Hi, I am facing a strange issue. I have a single-node install of CDH 5.4 and am trying to run Spark jobs. Only the first job runs; any job submitted after the first gets stuck in the ACCEPTED state. What could be the issue? Are there any limits that I might have accidentally set? Thanks, Baahu
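One plausible cause (an assumption, since the thread doesn't confirm it): on a single node offering the default 8192 MB of NodeManager memory, the first Spark job's ApplicationMaster plus executors can occupy every container, so later applications wait in ACCEPTED until memory frees up. A rough capacity check, with example container requests and YARN's rounding of each request up to yarn.scheduler.minimum-allocation-mb:

```python
import math

nm_memory_mb = 8192        # yarn.nodemanager.resource.memory-mb default
min_alloc_mb = 1024        # yarn.scheduler.minimum-allocation-mb default

def container_mb(requested_mb, overhead_mb=384):
    # YARN normalizes each request (plus overhead) up to a multiple
    # of the minimum allocation before scheduling it.
    return math.ceil((requested_mb + overhead_mb) / min_alloc_mb) * min_alloc_mb

am = container_mb(1024)              # example 1 GB ApplicationMaster -> 2048
executors = 2 * container_mb(2048)   # example: two 2 GB executors -> 6144
used = am + executors
print(used, nm_memory_mb - used)     # 8192 0 -> no room for a second job's AM
```

With zero MB left, the second job's ApplicationMaster cannot even be launched, which matches the ACCEPTED symptom; raising yarn.nodemanager.resource.memory-mb (or shrinking the first job's executors) should let later jobs through.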