Could someone clarify how to select the RAM size for the Hadoop components below?
It totally depends on the data. Hortonworks does publish recommendations, but in the end it comes down to how much data you have and how many jobs are running.
Thanks for the reply. But I need at least a basic starting point. Say I have 20 GB of data in my Hadoop cluster and quite a few jobs. What would be a good RAM size for all the components I mentioned?
It also depends on how much data each job processes and how intensive your processing or transformations are. For an average system you could look at 128 GB.
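To make that concrete, here is a rough sketch of the per-node YARN container memory math that sizing guides such as Hortonworks' walk through. The constants (24 GB reserved for the OS and other services, 2048 MB minimum container size, the 16-core / 8-disk node) are illustrative assumptions for a 128 GB node, not official values; check the guide for your distribution.

```python
# Sketch of a Hortonworks-style per-node YARN memory calculation.
# Reserved memory and minimum container size below are assumed
# example values, not authoritative recommendations.

def yarn_memory_settings(total_ram_gb, cores, disks,
                         reserved_gb=24, min_container_mb=2048):
    """Rough per-node container sizing: containers are bounded by
    cores, spindles, and the memory left after the OS reservation."""
    available_mb = (total_ram_gb - reserved_gb) * 1024
    containers = int(min(2 * cores, 1.8 * disks,
                         available_mb / min_container_mb))
    ram_per_container_mb = max(min_container_mb, available_mb // containers)
    return {
        "yarn.nodemanager.resource.memory-mb": containers * ram_per_container_mb,
        "yarn.scheduler.minimum-allocation-mb": ram_per_container_mb,
        "mapreduce.map.memory.mb": ram_per_container_mb,
        "mapreduce.reduce.memory.mb": 2 * ram_per_container_mb,
    }

# Example: a 128 GB node with 16 cores and 8 data disks
print(yarn_memory_settings(128, 16, 8))
```

With these assumptions the disk count is the binding limit (14 containers of about 7.4 GB each), which shows why "how much RAM" can't be answered from data volume alone: the node's cores and spindles shape the answer too.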