
Allocation of DataNode Memory and NodeManager Memory - Clarification

Explorer

I am a newbie to Cloudera Hadoop and am working on a MapReduce program.

I would like to increase the DataNode memory and the NodeManager memory.

Should I increase them in hadoop-env.sh and yarn-site.xml using the properties below?

I have been allocated around 128 GB of RAM on my slave nodes.

 

So is there any baseline to start with, or is it more of a crapshoot? My program is indeed computation-heavy.

 

If I want to increase the DataNode memory, should I put the numbers inside HADOOP_DATANODE_OPTS, or is there another setting? Please help me with this.
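For the DataNode heap specifically, the usual place is HADOOP_DATANODE_OPTS in hadoop-env.sh (or the DataNode Java heap field in Cloudera Manager). A minimal sketch, where the 4 GB heap is an assumed example value, not a recommendation:

```sh
# hadoop-env.sh -- DataNode JVM heap
# 4 GB below is an assumed example; size the heap to your block count and workload
export HADOOP_DATANODE_OPTS="-Xms4g -Xmx4g $HADOOP_DATANODE_OPTS"
```

Note that the DataNode heap is separate from the YARN/NodeManager container memory; increasing one does not affect the other.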

 

yarn.nodemanager.resource.memory-mb
yarn.app.mapreduce.am.resource.mb
mapreduce.map.memory.mb
mapreduce.reduce.memory.mb
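As a hedged starting point for a 128 GB slave node, one common approach is to reserve memory for the OS and Hadoop daemons, give the rest to YARN, and then size the AM/map/reduce containers as fractions of that. The values below are illustrative assumptions, not tuned recommendations:

```
<!-- yarn-site.xml: assume ~96 GB of the 128 GB goes to YARN containers,
     leaving the rest for the OS, DataNode, NodeManager, and other daemons -->
<property>
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>98304</value>
</property>

<!-- mapred-site.xml: example container sizes (assumptions; tune per job) -->
<property>
  <name>yarn.app.mapreduce.am.resource.mb</name>
  <value>4096</value>
</property>
<property>
  <name>mapreduce.map.memory.mb</name>
  <value>4096</value>
</property>
<property>
  <name>mapreduce.reduce.memory.mb</name>
  <value>8192</value>
</property>
```

The JVM heap options (mapreduce.map.java.opts / mapreduce.reduce.java.opts) are typically set to roughly 80% of the matching container size so the JVM fits inside the container.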

Any information is highly appreciated.

 

Thanks

4 Replies


Explorer

@saranvisa

 

Thanks for the response. I read your thread; everything in that post pertains to MapReduce.

We have 128 GB allocated for the slave nodes.

Could you please let me know what base number I should start with for:

yarn.nodemanager.resource.memory-mb

We have 2 hexa-core CPUs running. Please let me know if this is correct:

yarn.nodemanager.resource.cpu-vcores = 12
yarn.scheduler.minimum-allocation-vcores = 1
yarn.scheduler.maximum-allocation-vcores = 10

What should go inside:

 

yarn.scheduler.minimum-allocation-mb
yarn.scheduler.maximum-allocation-mb
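A sketch of one common choice, assuming the NodeManager is given about 96 GB (both values are assumptions; the minimum sets the container granularity, and the maximum is usually capped at the NodeManager's total):

```
<!-- yarn-site.xml: assumed example values -->
<property>
  <name>yarn.scheduler.minimum-allocation-mb</name>
  <value>2048</value>   <!-- smallest container YARN will hand out -->
</property>
<property>
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>98304</value>  <!-- typically <= yarn.nodemanager.resource.memory-mb -->
</property>
```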

Champion
Take a look here; it gives some good calculations to reach a baseline for the YARN/MR memory settings. Cores are straightforward: leave a core or two for the other processes and give the rest to YARN containers.

All of these changes should be made through CM; if you are not using CM, then in yarn-site.xml and mapred-site.xml.

http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.4.3/bk_installing_manually_book/content/determin...
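The style of arithmetic in guides like the one linked can be sketched as follows; the reserved-memory and containers-per-core figures below are assumptions for a 128 GB, 12-core node, not prescriptions:

```shell
# Baseline YARN memory arithmetic (assumed figures for a 128 GB / 12-core node)
TOTAL_MB=131072                          # 128 GB total RAM
RESERVED_MB=32768                        # assume 32 GB for OS + DataNode + other daemons
YARN_MB=$((TOTAL_MB - RESERVED_MB))      # memory handed to YARN containers
CONTAINERS=24                            # assume ~2 containers per core
CONTAINER_MB=$((YARN_MB / CONTAINERS))   # per-container granularity
echo "yarn.nodemanager.resource.memory-mb = $YARN_MB"
echo "yarn.scheduler.minimum-allocation-mb = $CONTAINER_MB"
```

With these assumed figures, YARN would get 98304 MB and each minimum container 4096 MB; adjust the reserved amount and container count to your actual daemons and workload.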

Community Manager

Here is one of our Community Knowledge articles that may also be of assistance when calculating memory size.

 

Selecting the Right Hardware for Your New Hadoop Cluster


Cy Jervis, Manager, Community Program