SYMPTOM: Ambari raises high-heap-size alerts for DataNodes on the production cluster. The maximum DataNode heap size is set to 16 GB.

ERROR: (Screenshot of the Ambari DataNode heap alert; image not reproduced here.)

ROOT CAUSE: DataNode operations are I/O-intensive, not memory-intensive, and do not require a 16 GB heap.

RESOLUTION: Tuning the GC parameters resolved the issue.

4 GB heap recommendation:
-Xms4096m -Xmx4096m -XX:NewSize=800m
-XX:MaxNewSize=800m -XX:+UseParNewGC
-XX:+UseConcMarkSweepGC
-XX:+UseCMSInitiatingOccupancyOnly
-XX:CMSInitiatingOccupancyFraction=70
-XX:ParallelGCThreads=8
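These flags are normally applied through the DataNode's JVM options rather than set globally. A minimal sketch of how they could be added in hadoop-env.sh, assuming the standard HADOOP_DATANODE_OPTS variable used by Hadoop 2.x (your template may already set other DataNode options that should be preserved):

```shell
# Sketch for hadoop-env.sh: append the tuned GC flags to the
# DataNode-specific JVM options, keeping any options already defined.
export HADOOP_DATANODE_OPTS="-Xms4096m -Xmx4096m -XX:NewSize=800m \
-XX:MaxNewSize=800m -XX:+UseParNewGC -XX:+UseConcMarkSweepGC \
-XX:+UseCMSInitiatingOccupancyOnly -XX:CMSInitiatingOccupancyFraction=70 \
-XX:ParallelGCThreads=8 ${HADOOP_DATANODE_OPTS}"
```

A DataNode restart is required for the new options to take effect.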
2,934 Views
Comments
@Sagar Shimpi

I have had this issue on my HDP 2.4.2 cluster since midnight; some DataNodes have shown high heap usage for more than 10 hours now. I saw your resolution, but can you be more specific about where to change these parameters? Should I change them in hadoop-env.sh, and how?

@PJ If you are using Ambari, modify Services -> HDFS -> Configs -> "hadoop-env template" (edit the section matching the Java version you are using, i.e., Java 8 or earlier than Java 8).
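After restarting the DataNode from Ambari, it can be worth confirming that the new flags actually reached the running JVM. A small sketch of a helper that pulls the heap and GC flags out of a JVM command line (the pgrep/ps usage in the comment is an assumption about your OS; the helper itself only needs standard tr/grep):

```shell
# Extract -Xms/-Xmx and -XX: flags from a JVM command line string.
extract_gc_flags() {
  # $1: the full command line of the JVM process
  echo "$1" | tr ' ' '\n' | grep -E -- '^-Xm[sx]|^-XX:'
}

# On a live node you might feed it the DataNode process, e.g.:
#   extract_gc_flags "$(ps -o args= -p "$(pgrep -f datanode.DataNode)")"
```

If the tuned values (e.g. -Xmx4096m) do not appear, the template change likely was not saved or the service was not restarted.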

I will just try that.

Last update: 08-17-2019 06:51 AM