Java Heap Memory Calculation for components in CDH5?

Explorer

Is there any best-practice "Java Heap Memory" calculation for CDH components such as HDFS, Hive, YARN, MapReduce, and so on? I referred to the documentation, but it is a bit complicated to understand; I am keenly looking for a best practice for heap memory calculation in CDH5. Thanks in advance.

4 REPLIES

Champion

@vkrish

 

I'm not sure which document you referred to (in case you were looking at a different link).

 

The link below includes an Excel sheet; download it and fill it in as needed to get the recommended values.

https://www.cloudera.com/documentation/enterprise/5-10-x/topics/cdh_ig_yarn_tuning.html
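
Roughly, the spreadsheet walks you through subtracting the memory reserved for the OS and any co-located roles from the host's total RAM to arrive at the memory NodeManager can hand out to containers. A minimal sketch of that arithmetic, with purely hypothetical numbers (the spreadsheet itself is the authoritative source):

```python
# Sketch of the kind of arithmetic the YARN tuning spreadsheet automates.
# All numbers below are hypothetical placeholders; substitute your own
# worker-host hardware and the roles that actually run on it.

node_ram_gb = 128  # total physical RAM on a worker host (assumed)

# Memory reserved for things other than YARN containers (assumed values)
reserved_gb = {
    "operating_system": 8,
    "cloudera_manager_agent": 1,
    "hdfs_datanode": 1,
    "impala_daemon": 16,  # only if that role is co-located on the host
}

# Memory left for NodeManager containers on this host
yarn_memory_gb = node_ram_gb - sum(reserved_gb.values())
print(f"yarn.nodemanager.resource.memory-mb ~= {yarn_memory_gb * 1024}")
```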

 

 

Explorer

@saranvisa

 

Thanks for your time in responding to my query. I am referring to the link below:

https://www.cloudera.com/documentation/enterprise/5-10-x/topics/cm_mc_autoconfig.html 

 

specifically the Memory option described there.

Is that the right document to refer to, and which of the two is the standard one to follow? Please clarify.

 

Thanks 

    @vkrish

Champion

@vkrish

 

You can follow either of those links, depending on your requirement.

 

1.

https://www.cloudera.com/documentation/enterprise/5-10-x/topics/cm_mc_autoconfig.html

If you want to install, upgrade, add a new service, add a cluster, etc., then you can follow your link (autoconfiguration). This is mostly one-time work, as it is used only for the initial setup. It is also optional for small clusters and may be more suitable for large clusters.

 

2.

https://www.cloudera.com/documentation/enterprise/5-10-x/topics/cdh_ig_yarn_tuning.html

If you are using MapReduce, YARN, Hive, Spark on MR, Sqoop, etc., and you want to tune those jobs, then you can use the link I shared. Whenever you install/configure your cluster, it comes with default settings; the defaults may not suit every scenario, so this link gives you instructions to customize your environment for better performance.
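
As a rough illustration of the kind of tuning that guide covers (this is an assumed rule of thumb, not a value from the document): a container's Java heap (-Xmx, set via mapreduce.map.java.opts / mapreduce.reduce.java.opts) is commonly sized at about 75-80% of the container memory, leaving headroom for non-heap usage.

```python
# Rough sketch only; the YARN tuning guide/spreadsheet is the authoritative source.
# Assumption: Java heap (-Xmx) is set to roughly 80% of the container memory,
# leaving headroom for non-heap usage (stack, code cache, direct buffers).

HEAP_FRACTION = 0.8  # assumed ratio; adjust for your workloads

def suggested_heap_mb(container_mb: int) -> int:
    """Suggested -Xmx (in MB) for a container of the given size."""
    return int(container_mb * HEAP_FRACTION)

# Hypothetical container sizes for map and reduce tasks
containers = {
    "mapreduce.map.memory.mb": 2048,
    "mapreduce.reduce.memory.mb": 4096,
}

for prop, mb in containers.items():
    print(f"{prop}={mb} -> java.opts ~ -Xmx{suggested_heap_mb(mb)}m")
```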

Explorer

@saranvisa

 

Thanks for your reply. Is there an easier way to understand the memory allocation for Hadoop components under the autoconfiguration approach? It is a bit confusing to understand.

 

 

 

Thanks in advance.

@vkrish