Support Questions

Recommended YARN memory settings for 32GB of RAM


I'm trying to determine the correct YARN memory settings for our Hadoop cluster, whose nodes have 32GB of RAM. We applied the recommended settings (via Ambari) using the yarn-utils script referenced in the "Determining HDP Memory Configuration Settings" article (see link below). It seems to work fine most of the time, but I started questioning the settings when one of my more experienced team members thought they looked too high. I also noticed that the Reserved Memory Allocations table has no recommended values for nodes with 32GB of RAM. Is there a specific reason for this?

Also, I was looking at the yarn-utils Python script. It has two tables that are used to calculate the reserved memory for the OS, HBase, etc., but neither table has values for 32GB either. Has anyone else come across this, and what settings did you apply for 32GB of RAM? Any suggestions are very much appreciated! 🙂 Thanks!
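For what it's worth, the reserved-memory table in the article jumps from 24GB straight to 48GB. One rough workaround is to linearly interpolate the missing 32GB row between those two neighbors. The sketch below does that; note the 24GB and 48GB row values are as I recall them from the article (please double-check them there), and interpolating is my own idea, not something the article or the script does:

```python
# Hedged sketch: linearly interpolate the missing 32 GB row of the
# reserved-memory table. The 24 GB and 48 GB rows below are as I recall
# them from the "Determining HDP Memory Configuration Settings" article
# (verify against the article); interpolation is my workaround, not the
# yarn-utils script's behavior.

def interpolate_reserved(total_gb, lo, hi):
    """Linear interpolation between two (ram_gb, os_reserved_gb, hbase_reserved_gb) rows."""
    frac = (total_gb - lo[0]) / (hi[0] - lo[0])
    os_gb = lo[1] + frac * (hi[1] - lo[1])
    hbase_gb = lo[2] + frac * (hi[2] - lo[2])
    return round(os_gb), round(hbase_gb)

# (total RAM GB, reserved for OS GB, reserved for HBase GB)
row_24 = (24, 4, 4)
row_48 = (48, 6, 8)
print(interpolate_reserved(32, row_24, row_48))
```

Interpolated this way you'd reserve roughly 10GB in total, which is noticeably more than the 3GB the script reserved in my run below; that gap might be why the resulting YARN allocations look high.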

Here are the outputs from running the yarn-utils script (note: I modified it slightly to add some log statements):

$ python yarn-utils.py -c 4 -m 32 -d 1 -k True
Using cores=4 memory=32GB disks=1 hbase=True
Min Container Size: 2048
Reserved Stack Memory: 1
Reserved Hbase Memory: 2
Profile: cores=4 memory=29696MB reserved=3GB usableMem=29GB disks=1
Num Container=3
Container Ram=9728MB
Used Ram=28GB
Unused Ram=3GB