Created 02-20-2019 10:37 AM
I have a large cluster installed; currently it is used only for Spark-on-YARN jobs.
What services on the master can I remove/uninstall to free up some RAM?
The one server that accepts Spark jobs currently has the following services:
and multiple clients with:
Created 02-20-2019 10:54 AM
If you are not using the Hive service (Hive Metastore / HiveServer2 / Hive Client) then you can remove it.
The same goes for the AMS Collector, which internally starts an HBase instance to store the cluster and service metrics data. So if you are not particularly interested in the metrics data, you can also delete the AMS service (AMS Collector / Grafana / Metrics Monitors).
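For reference, deleting a service such as AMS is normally done through the Ambari REST API after it has been stopped. A minimal sketch, assuming an Ambari server at AMBARI_HOST:8080, a cluster named CLUSTER_NAME, and admin:admin credentials (all placeholders, substitute your own values):

# Stop the Ambari Metrics service first (Ambari will not delete a running service)
curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Stop AMBARI_METRICS"},"Body":{"ServiceInfo":{"state":"INSTALLED"}}}' \
  http://AMBARI_HOST:8080/api/v1/clusters/CLUSTER_NAME/services/AMBARI_METRICS

# Then delete it
curl -u admin:admin -H 'X-Requested-By: ambari' -X DELETE \
  http://AMBARI_HOST:8080/api/v1/clusters/CLUSTER_NAME/services/AMBARI_METRICS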
Created 02-20-2019 11:02 AM
Thanks for the quick response.
Can I delete all of Hive?
Metrics are useful; I can see some bottlenecks and improve the performance.
I am currently still struggling to improve the time it takes to allocate the containers for each job (the containers are the same every time, yet allocation takes 30-60 seconds per job).
By the way, why is Hive a requirement? At the install stage it was selected because I needed Spark/YARN.
Created 02-20-2019 11:09 AM
A small correction here.
I just checked that Spark2 has a dependency on the Hive service.
# grep -A1 -B1 'HIVE' /var/lib/ambari-server/resources/common-services/SPARK2/2.0.0/metainfo.xml
<dependency>
  <name>HIVE/HIVE_METASTORE</name>
  <scope>cluster</scope>
Hence you should not delete the HIVE service. But if you are not using Hive, then just stop the Hive service components so they do not use RAM, and put the Hive service in Maintenance Mode.
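If you prefer to do this from the command line instead of the Ambari UI, here is a minimal sketch using the Ambari REST API (AMBARI_HOST, CLUSTER_NAME and the admin:admin credentials are placeholders for your environment):

# Stop all HIVE components (state "INSTALLED" means stopped in Ambari terms)
curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Stop HIVE"},"Body":{"ServiceInfo":{"state":"INSTALLED"}}}' \
  http://AMBARI_HOST:8080/api/v1/clusters/CLUSTER_NAME/services/HIVE

# Put the HIVE service into Maintenance Mode so Ambari stops alerting on it
curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Maintenance Mode ON for HIVE"},"Body":{"ServiceInfo":{"maintenance_state":"ON"}}}' \
  http://AMBARI_HOST:8080/api/v1/clusters/CLUSTER_NAME/services/HIVE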
Created 02-20-2019 11:15 AM
OK, I just stopped the 3 Hive services. It still bothers me a bit that there are red dots in the UI :).
By the way, why is there a dependency on Hive?
Created 02-20-2019 11:21 AM
Spark can be used to interact with Hive.
When you install Spark using Ambari, the hive-site.xml file is automatically populated with the Hive metastore location.
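As a quick sanity check that Spark is actually picking up that metastore configuration, something like the following should list the Hive databases known to the metastore. This is a minimal sketch; it assumes the Spark2 client is installed on the node (hive-site.xml typically lives under /etc/spark2/conf on HDP) and that the defaults are otherwise untouched:

# spark-shell reads the generated hive-site.xml and connects to the Hive metastore
echo 'spark.sql("show databases").show()' | spark-shell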