Initially I had two machines on which to set up Hadoop, Spark, HBase, Kafka, ZooKeeper, and MR2. Each of those machines had 16GB of RAM. I used Apache Ambari to set up both machines with the services mentioned above.
I have now upgraded the RAM in each of those machines to 128GB.
How can I now tell Ambari to scale up all of its services to make use of the additional memory?
Do I need to understand how memory is configured for each of these services?
Is this covered somewhere in the Ambari documentation?
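
In case it helps to show what I have looked at so far, here is a rough Python sketch that dumps the memory-related YARN settings Ambari currently has, using the Ambari REST API as I understand it from the docs. The host, cluster name, and credentials below are placeholders for my setup, and the "memory" keyword filter is just my guess at which properties matter:

```python
# Sketch: list the currently active yarn-site memory settings via the Ambari REST API,
# so I can see what was generated back when the machines only had 16GB of RAM.
import requests

AMBARI = "http://ambari-host:8080/api/v1/clusters/mycluster"  # placeholder host/cluster
AUTH = ("admin", "admin")                                     # placeholder credentials
HEADERS = {"X-Requested-By": "ambari"}

# 1. Find the tag of the currently active yarn-site configuration version.
desired = requests.get(
    AMBARI, params={"fields": "Clusters/desired_configs"},
    auth=AUTH, headers=HEADERS,
).json()
tag = desired["Clusters"]["desired_configs"]["yarn-site"]["tag"]

# 2. Fetch that configuration version and print properties that look memory-related.
cfg = requests.get(
    AMBARI + "/configurations",
    params={"type": "yarn-site", "tag": tag},
    auth=AUTH, headers=HEADERS,
).json()
props = cfg["items"][0]["properties"]

for key, value in sorted(props.items()):
    if "memory" in key or key.endswith("-mb"):
        print(f"{key} = {value}")
```

I could obviously hand-edit values like these for every service, but I was hoping Ambari could recompute sensible settings for 128GB hosts the same way it suggested values at install time, rather than me updating each property one by one.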