
Recommended YARN memory settings for 32GB of RAM

I'm trying to determine the correct YARN memory settings for our Hadoop cluster, whose nodes each have 32GB of RAM. We applied the recommended settings (via Ambari) using the yarn-utils script referenced in the "Determining HDP Memory Configuration Settings" article (see link below). It seems to be working fine most of the time. However, I started to question the settings when one of my more experienced team members thought they looked too high. I also noticed that the Reserved Memory Allocations table has no row for nodes with 32GB of RAM. Is there a specific reason for this?

Also, I was looking at the yarn-utils Python script. It has two tables that are used to calculate the memory reserved for the OS, HBase, etc., but there are no values for 32GB in those tables either. Has anyone else come across this, and what settings did you apply for 32GB of RAM? Any suggestions are very much appreciated! :) Thanks!

https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.0/bk_command-line-installation/content/determ...
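For anyone wanting to reproduce the issue without running the full script, here is a minimal sketch of how the reserved-memory lookup appears to behave. The table values are the ones published in the HDP documentation; the fallback behavior for sizes missing from the table (like 32) is an assumption reconstructed from the output below, not the script's verbatim code.

```python
# Reserved memory (GB) keyed by total node RAM (GB), per the HDP
# "Determining HDP Memory Configuration Settings" tables.
RESERVED_STACK = {4: 1, 8: 2, 16: 2, 24: 4, 48: 6, 64: 8,
                  72: 8, 96: 12, 128: 24, 256: 32, 512: 64}
RESERVED_HBASE = {4: 1, 8: 1, 16: 2, 24: 4, 48: 8, 64: 8,
                  72: 8, 96: 16, 128: 24, 256: 32, 512: 64}

def reserved_memory(table, memory_gb, default):
    """Look up reserved GB for a node size; note 32 is in neither table."""
    if memory_gb in table:
        return table[memory_gb]
    if memory_gb <= 4:
        return table[4]
    if memory_gb >= 512:
        return table[512]
    # Intermediate sizes missing from the table (e.g. 32) seem to fall
    # through to a small default rather than interpolating between the
    # 24GB and 48GB rows -- this matches the 1GB stack / 2GB HBase
    # reservation in the output below (an assumption on my part).
    return default

stack = reserved_memory(RESERVED_STACK, 32, default=1)   # 1GB
hbase = reserved_memory(RESERVED_HBASE, 32, default=2)   # 2GB
print(stack + hbase)  # 3GB total reserved, leaving 29GB usable
```

If that reading is right, a 32GB node reserves only 3GB, while interpolating between the 24GB row (4+4GB) and 48GB row (6+8GB) would reserve something closer to 10GB, which would explain why the generated settings look too high.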

Here is the output from running the yarn-utils script (note: I modified it slightly to add some log statements):

$ python yarn-utils.py -c 4 -m 32 -d 1 -k True
Using cores=4 memory=32GB disks=1 hbase=True
Min Container Size:2048
Reserved Stack Memory:1
Reserved Hbase Memory:2
Profile: cores=4 memory=29696MB reserved=3GB usableMem=29GB disks=1
Num Container=3
Container Ram=9728MB
Used Ram=28GB
Unused Ram=3GB
yarn.scheduler.minimum-allocation-mb=9728
yarn.scheduler.maximum-allocation-mb=29184
yarn.nodemanager.resource.memory-mb=29184
mapreduce.map.memory.mb=9728
mapreduce.map.java.opts=-Xmx7782m
mapreduce.reduce.memory.mb=9728
mapreduce.reduce.java.opts=-Xmx7782m
yarn.app.mapreduce.am.resource.mb=9728
yarn.app.mapreduce.am.command-opts=-Xmx7782m
mapreduce.task.io.sort.mb=3891
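For what it's worth, the derived values above are internally consistent with the HDP sizing guidance (JVM heap at 0.8x the container size, io.sort.mb at 0.4x the map container). A quick sanity check of the arithmetic, not the script itself:

```python
# Cross-checking the posted numbers against the HDP sizing factors.
container_mb = 9728       # Container Ram from the output above
num_containers = 3        # Num Container from the output above

heap_mb = int(0.8 * container_mb)        # JVM heap = 0.8 * container
node_mb = num_containers * container_mb  # total RAM given to NodeManager
sort_mb = int(0.4 * container_mb)        # io.sort.mb = 0.4 * map container

print(heap_mb)  # 7782  -> matches -Xmx7782m
print(node_mb)  # 29184 -> matches yarn.nodemanager.resource.memory-mb
print(sort_mb)  # 3891  -> matches mapreduce.task.io.sort.mb
```

So the per-container math checks out; the question is really whether a 3GB reservation (29GB usable) is appropriate for a 32GB node in the first place.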
