I'm interested in determining the correct YARN memory settings for our Hadoop cluster, whose nodes have 32GB of RAM each. We applied the recommended settings (via Ambari) using the yarn-utils script referenced in the "Determining HDP Memory Configuration Settings" article (see link below), and it seems to be working fine most of the time. However, I started to question the settings when one of my more experienced team members thought they looked too high. I also noticed that the Reserved Memory Allocations table in that article has no recommended values for nodes with 32GB of RAM. Is there a specific reason for this?
Also, I was looking at the yarn-utils Python script. It has two tables that are used to calculate the memory reserved for the OS, HBase, etc., but there are no entries for 32GB in those tables either. Has anyone else come across this, and what settings did you apply for nodes with 32GB of RAM? Any suggestions are very much appreciated! :) Thanks!
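For context, here is a rough sketch of the kind of lookup I'm talking about, so it's clear what I mean by "no entry for 32GB". The table values and the fallback-to-nearest-smaller-key behavior below are just my own illustration of what I'd expect, not copied from yarn-utils.py:

```python
# Illustrative sketch only -- table values are placeholders, not taken from yarn-utils.py.
RESERVED_OS_GB = {8: 2, 16: 2, 24: 4, 48: 6, 64: 8}      # total node RAM (GB) -> GB reserved for the OS
RESERVED_HBASE_GB = {8: 1, 16: 2, 24: 4, 48: 8, 64: 8}   # total node RAM (GB) -> GB reserved for HBase

def reserved_memory(total_ram_gb, table):
    """Return reserved memory for a node, falling back to the nearest
    smaller key when the exact RAM size (e.g. 32GB) is not in the table."""
    if total_ram_gb in table:
        return table[total_ram_gb]
    smaller_keys = [k for k in table if k <= total_ram_gb]
    if not smaller_keys:
        return min(table.values())
    return table[max(smaller_keys)]

if __name__ == "__main__":
    ram = 32
    os_reserved = reserved_memory(ram, RESERVED_OS_GB)
    hbase_reserved = reserved_memory(ram, RESERVED_HBASE_GB)
    print(f"OS reserved: {os_reserved}GB, HBase reserved: {hbase_reserved}GB, "
          f"left for YARN: {ram - os_reserved - hbase_reserved}GB")
```

With that fallback, a 32GB node would just reuse the 24GB row, which is part of why I'm unsure whether the resulting YARN memory is really what's intended for 32GB nodes.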