How to increase Yarn memory?

Rising Star

When I run a Hive query, the YARN memory gets almost full, so the job takes more than 10 minutes to complete. I want to minimize this delay. Is it possible to increase the YARN memory?

7 REPLIES

Expert Contributor

You should be able to over-subscribe memory by setting yarn.nodemanager.resource.memory-mb to a value higher than the actual physical memory in your nodes. Alternatively, you might want to check the value of yarn.scheduler.minimum-allocation-mb and lower it a bit to make room for more containers.
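
For reference, these two settings live in yarn-site.xml. A minimal sketch might look like the following (the values are purely illustrative, and on an Ambari-managed cluster you would change them through Ambari rather than editing the file by hand):

  <!-- total memory on each node that the NodeManager may hand out to containers -->
  <property>
    <name>yarn.nodemanager.resource.memory-mb</name>
    <value>16384</value>
  </property>

  <!-- smallest container YARN will allocate; lowering it lets more containers fit -->
  <property>
    <name>yarn.scheduler.minimum-allocation-mb</name>
    <value>1024</value>
  </property>

Either change requires restarting the affected YARN services (ResourceManager/NodeManagers) before it takes effect.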

Rising Star

Hi @Gour Saha,

Thanks for the response.

Where do I find those two properties in Ambari?

Expert Contributor

Once you go to the YARN Configs tab, you can search for those properties. In the latest versions of Ambari they show up in the Settings tab (not the Advanced tab) as sliders. You can increase the values by moving the slider to the right, or click the edit (pen) icon to enter a value manually.
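
If you prefer to double-check the values outside the UI, Ambari's REST API can also return the current yarn-site configuration. A rough sketch (the host name, cluster name, tag and credentials below are placeholders for your own environment):

  # find the tag of the currently desired yarn-site configuration
  curl -u admin:admin "http://ambari-host:8080/api/v1/clusters/MyCluster?fields=Clusters/desired_configs"

  # fetch that version of yarn-site and look for the two memory properties in the output
  curl -u admin:admin "http://ambari-host:8080/api/v1/clusters/MyCluster/configurations?type=yarn-site&tag=TAG_FROM_PREVIOUS_CALL"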

Rising Star

That depends on the system memory of your Hadoop cluster. If you have three nodes running the DataNode and NodeManager with 128 GB RAM each, you can set the total YARN container memory and the minimum/maximum container sizes from the Ambari web UI. Based on the available system memory, I would typically recommend 1024 MB or 2048 MB for the minimum container size, 4 GB, 8 GB or higher for the maximum container size, and roughly 90 GB to 100 GB for the total memory allocated to YARN containers on each node. Of course, the total YARN container memory depends on how much memory each node actually has available for the NodeManager.
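
To make those numbers concrete, here is a rough worked example under the assumed settings (about 100 GB per node given to YARN; the figures are illustrative only):

  100 GB per node / 2 GB minimum container size = up to 50 containers per node
  100 GB per node / 4 GB per Hive/Tez task      = 25 concurrent 4 GB containers per node
  3 nodes x 25 containers                       = roughly 75 containers across the example cluster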

Rising Star

Hi @Peter Kim,

Thanks for the response.

I have a 7-node cluster with 128 GB RAM per node, and my YARN memory is 840 GB. Can I increase this?

Rising Star

No. 840 GB means each node is already giving almost 120 GB of its RAM to YARN, and that is not an ideal way to run the system, because every node needs some free memory for other services such as OS processes and the agents used by Ambari. Start with 90 GB to 100 GB per node and then adjust it gradually from there.
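
As a quick sanity check on those numbers (a sketch, assuming the 7 x 128 GB nodes described above):

  840 GB total / 7 nodes     = ~120 GB per node currently handed to YARN
  128 GB - 120 GB            = ~8 GB left per node for the OS, DataNode, Ambari agent, etc.
  7 nodes x ~100 GB per node = ~700 GB as a safer cluster-wide YARN total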