At this point we are running an HDP cluster managed by Ambari for purely educational purposes, like going through Hadoop tutorial examples. We use a one-node cluster running on a computer with 6 GB of memory.
This constantly causes problems because of the lack of memory: tasks fail, time out, or run extremely slowly because of intensive swapping to disk.
The default memory settings in HDP request gigabytes of memory for almost everything. That is probably good in a production environment, but it does not make much sense in an educational environment where we deal with tiny amounts of data.
Can anybody please recommend optimal HDP memory settings for a computer with moderate memory? There are 100+ memory-related settings across the various service configurations, and it is really difficult for a Hadoop beginner to understand which ones are most important and which are secondary.
@Dmitry Otblesk I suggest you shut down the services you are not using as part of the tutorial, like HBase, Storm, Atlas, and many others; you can start them again once you reach the tutorials that cover them.
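If you prefer not to click through the UI for every service, the same thing can be done through the Ambari REST API: setting a service's desired state to INSTALLED stops it. A minimal sketch, assuming the default Ambari port, admin credentials, and a cluster named mycluster (adjust all three to your setup):

```shell
#!/bin/sh
# Hedged sketch: stop services not needed for the basic tutorials via the
# Ambari REST API. Host, credentials, and cluster name are assumptions.
AMBARI=http://localhost:8080
CLUSTER=mycluster

stop_service() {
  # A desired state of INSTALLED means "deployed but stopped".
  curl -sS -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
    -d '{"RequestInfo":{"context":"Stop '"$1"'"},"Body":{"ServiceInfo":{"state":"INSTALLED"}}}' \
    "$AMBARI/api/v1/clusters/$CLUSTER/services/$1" \
    || echo "could not reach Ambari at $AMBARI"
}

# Services rarely needed while working through the basic tutorials:
for svc in HBASE STORM ATLAS; do
  stop_service "$svc"
done
```

The exact service names must match what Ambari shows for your stack (all caps, e.g. HBASE).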
The memory requirement is not driven by the small amount of data in the POC environment; it is driven by the many services running in the ecosystem. Even the sandbox requires 8 GB. When you installed this single-node cluster, Ambari most likely reported some memory warnings.
It is not a surprise that such a complex ecosystem requires 8 GB. Even the Eclipse IDE needs 8 GB these days to run nicely, and that is just a development tool.
My 2c is to reset your expectations and, as suggested below, stop some of the services and keep only those absolutely needed. A sandbox with 8 GB is enough to run through the tutorials with small data. If you want to do more than that, 16 GB is the minimum.
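To put rough numbers on "small": on a 6–8 GB node most of the memory pressure comes from a handful of YARN and MapReduce container settings. The values below are only a sketch for a low-memory single node, not official guidance; adjust them in yarn-site.xml and mapred-site.xml through Ambari:

```xml
<!-- yarn-site.xml: cap what YARN may hand out on the single node
     (illustrative low-memory values, not official recommendations) -->
<property>
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>3072</value> <!-- total RAM YARN may use for containers -->
</property>
<property>
  <name>yarn.scheduler.minimum-allocation-mb</name>
  <value>256</value>  <!-- smallest container YARN will grant -->
</property>
<property>
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>3072</value>
</property>

<!-- mapred-site.xml: shrink the per-task containers accordingly -->
<property>
  <name>mapreduce.map.memory.mb</name>
  <value>512</value>
</property>
<property>
  <name>mapreduce.reduce.memory.mb</name>
  <value>512</value>
</property>
<property>
  <name>yarn.app.mapreduce.am.resource.mb</name>
  <value>512</value>
</property>
<property>
  <name>mapreduce.map.java.opts</name>
  <value>-Xmx410m</value> <!-- heap at roughly 80% of the container size -->
</property>
```

The general rule is that the Java heap (`-Xmx`) should be around 80% of the container size, leaving headroom for off-heap memory.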
Is there any document that outlines the minimal hardware requirements for HDP?
So far I have only been able to find a document with recommended requirements. These are very different from minimal requirements, since the recommended ones have a production environment in mind.
>most likely Ambari report some memory limits
Where exactly does Ambari report this? If you are talking about the Dashboard, then no, it does not currently show any memory complaints. There are no alerts shown either. Is there a particular place in Ambari where I can check this?
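For what it's worth, if the Dashboard shows nothing, the alert list can also be queried directly over Ambari's REST API. A hedged sketch (host, credentials, and cluster name are assumptions, as is the choice of fields):

```shell
#!/bin/sh
# Hedged sketch: list current alerts from the Ambari REST API.
AMBARI=http://localhost:8080   # assumed Ambari host/port
CLUSTER=mycluster              # assumed cluster name

curl -sS -u admin:admin \
  "$AMBARI/api/v1/clusters/$CLUSTER/alerts?fields=Alert/label,Alert/state,Alert/text" \
  || echo "could not reach Ambari at $AMBARI"
```

An empty `items` list in the response would confirm that no alerts are currently firing.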