Created 09-15-2016 07:08 AM
Hi everyone!
I am a student at the University of the Basque Country and I am trying to deploy a small cluster with HDP. I was looking through the Hortonworks installation guide, but I didn't find the answer I was looking for. My question is:
How much memory and disk space are needed to deploy a cluster of 2 initial nodes with Ambari and HDP? I have 25GB of disk and 4GB of memory per node. Is that enough?
Thank you so much!
Created 09-15-2016 07:39 AM
Hi,
Please follow this document for the HDP requirements:
Created 09-15-2016 07:49 AM
I saw that page, but it only lists the minimum requirements for installing Ambari. In this case, if I follow the minimum specifications for installing Ambari, will I also meet the requirements for installing HDP correctly? Thank you for your quick answer!
Created 09-15-2016 07:57 AM
Add more disk space to the slave node and more memory to the master node. Which services do you want to install on HDP?
Created 09-15-2016 08:04 AM
The components I want to install are the following: AMBARI, ZEPPELIN, SQOOP, RANGER, YARN, HDFS, SPARK (Spark SQL, Python, R). Thank you for your answer!
Created 09-15-2016 08:08 AM
Install all master components on the master machine, which should have more than 10GB of memory and 50GB of disk space. On the slave node, add more disk space depending on your usage requirements; 4GB of memory will be sufficient there.
Created 09-15-2016 08:08 AM
Install the slave components on the slave node as well.
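As a rough illustration of that layout, the master/slave split could be described in an Ambari blueprint like the sketch below. This is only an example under assumptions: the blueprint name, stack version, and the exact component list are placeholders you would adapt to your HDP version and the services you actually choose (Ranger and Zeppelin components are omitted here for brevity).

```json
{
  "Blueprints": {
    "blueprint_name": "two-node-sketch",
    "stack_name": "HDP",
    "stack_version": "2.5"
  },
  "host_groups": [
    {
      "name": "master",
      "cardinality": "1",
      "components": [
        { "name": "NAMENODE" },
        { "name": "SECONDARY_NAMENODE" },
        { "name": "RESOURCEMANAGER" },
        { "name": "HISTORYSERVER" },
        { "name": "ZOOKEEPER_SERVER" },
        { "name": "SPARK_JOBHISTORYSERVER" }
      ]
    },
    {
      "name": "slave",
      "cardinality": "1",
      "components": [
        { "name": "DATANODE" },
        { "name": "NODEMANAGER" },
        { "name": "SPARK_CLIENT" },
        { "name": "SQOOP" }
      ]
    }
  ]
}
```

With a blueprint like this, the heavier master daemons land on the bigger node, while the slave node only runs the DataNode/NodeManager workers plus clients, which is why 4GB of memory can be enough there.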
Created 09-15-2016 08:23 AM
Thank you so much for your help! I will try it that way!
Created 09-15-2016 09:48 AM
If this solved your question, please accept the answer; that will close this issue.