I am a student at the University of the Basque Country and I am trying to deploy a small cluster with HDP. I was looking through the Hortonworks installation guide, but I didn't find the answer I was looking for. My question is:
How much memory and disk space are needed to deploy a cluster of 2 initial nodes with Ambari and HDP? I have 25 GB of disk and 4 GB of memory per node. Is that enough?
Thank you so much!
I saw that page, but it only lists the minimum requirements for installing Ambari. In this case, if I follow the minimum specifications for installing Ambari, would that also meet the requirements for installing HDP correctly? Thank you for your quick answer!
The components I want to install are the following: AMBARI, ZEPPELIN, SQOOP, RANGER, YARN, HDFS, SPARK (Spark SQL, Python, R). Thank you for your answer!
Install all master components on a master machine that has more than 10 GB of memory and 50 GB of disk space. On the slave node, add more disk space depending on your usage requirements; 4 GB of memory will be sufficient there.
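As a quick sanity check before running the Ambari installer, you could verify each node against those numbers with a small script. This is only a sketch: the 10 GB / 50 GB thresholds come from the recommendation above (use the lower slave-node values where appropriate), and it checks free space on `/`, so adjust the mount point if your HDP data directories live elsewhere.

```shell
#!/bin/sh
# Hypothetical pre-flight check for an Ambari/HDP master node.
# Thresholds follow the suggestion above; tune them per node role.
MIN_MEM_GB=10
MIN_DISK_GB=50

# Total RAM in GB, read from /proc/meminfo (value is in kB).
mem_gb=$(awk '/MemTotal/ {printf "%d", $2/1024/1024}' /proc/meminfo)

# Available space in GB on the root filesystem (GNU df).
disk_gb=$(df -BG --output=avail / | tail -1 | tr -dc '0-9')

echo "Memory: ${mem_gb} GB (suggested >= ${MIN_MEM_GB} GB)"
echo "Free disk on /: ${disk_gb} GB (suggested >= ${MIN_DISK_GB} GB)"

if [ "$mem_gb" -ge "$MIN_MEM_GB" ] && [ "$disk_gb" -ge "$MIN_DISK_GB" ]; then
    echo "Node meets the suggested minimums"
else
    echo "Node is below the suggested minimums"
fi
```

Run it on every node before installation; with 4 GB of RAM and 25 GB of disk per node, it would report the nodes as below the master-node minimums, which matches the answer above.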