

Is there a formula for computing the amount of memory required for JVM containers?

Explorer

We are trying to configure a new server. Is there a formula that can be used to determine the optimum memory required for JVM containers? We are using HDFS, YARN, Solr, MapReduce...

1 REPLY

Re: Is there a formula for computing the amount of memory required for JVM containers?

Expert Contributor

@sbd4q0 This doc can help you understand the heap requirements:

https://docs.cloudera.com/documentation/enterprise/release-notes/topics/hardware_requirements_guide....

Also, as a general rule of thumb, 1 million HDFS blocks ≈ 1 GB of heap needed.
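
As a minimal sketch of that rule of thumb (assuming it refers to NameNode heap, as in the linked guide; the function name and example figures here are illustrative):

def estimate_heap_gb(block_count):
    # Rule of thumb from above: ~1 GB of heap per 1 million HDFS blocks
    return max(1.0, block_count / 1_000_000)

# e.g. a cluster with 45 million blocks needs roughly 45 GB of heap
print(estimate_heap_gb(45_000_000))  # 45.0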

The calculation to determine Hadoop memory over-commit per host is as follows:

commit = available_memory_for_hadoop - total_hadoop_java_heap - impala_memory

if (total_system_memory * 0.8) < (sum(java_heap_of_processes) * 1.3 + impala_memory)
    then flag the host as over-committed
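
Here is a sketch of that check in Python (variable names mirror the pseudocode above; my reading is that the 0.8 factor reserves ~20% of RAM for the OS and the 1.3 factor allows for JVM overhead beyond the configured heaps, though the linked doc is the authoritative source):

def is_overcommitted(total_system_memory_gb, java_heaps_gb, impala_memory_gb=0.0):
    # Flag the host as over-committed per the check above:
    # 80% of physical RAM must cover 1.3x the summed JVM heaps plus Impala memory.
    usable = total_system_memory_gb * 0.8
    demand = sum(java_heaps_gb) * 1.3 + impala_memory_gb
    return usable < demand

# Example: a 64 GB host with JVM heaps of 8 + 4 + 4 + 2 GB plus 16 GB for Impala
print(is_overcommitted(64, [8, 4, 4, 2], 16))  # False: 51.2 GB usable vs 39.4 GB demanded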