Created 10-25-2017 11:02 AM
How to know what is eating YARN memory?
I ask because we have exactly two Ambari clusters.
On one Ambari cluster, YARN memory usage is almost 100%.
On the second, YARN memory usage is 50%.
So how can I find out what is eating the YARN memory on the first cluster?
Created on 10-25-2017 06:39 PM - edited 08-17-2019 06:14 PM
You can check this from the YARN Resource Manager UI (RM UI).
From the Ambari YARN page, open the RM UI.
From the RM UI, you can look at the applications that are running under YARN. From there you can see the memory consumption of each application and compare your clusters for discrepancies.
RM UI showing the list of apps (with Allocated Memory).
You can click on a specific app for a detailed view of the queue and memory used.
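If you prefer to check this outside the UI, the same information is exposed by the ResourceManager REST API under /ws/v1/cluster/apps. Here is a minimal Python sketch (the RM host and port are placeholders you would replace with your own) that lists running apps sorted by allocated memory:

```python
import json
import urllib.request

# Placeholder RM address -- replace with your ResourceManager host:port.
RM_URL = "http://resourcemanager.example.com:8088"

# The ResourceManager REST API exposes all apps and their resource usage.
with urllib.request.urlopen(f"{RM_URL}/ws/v1/cluster/apps?states=RUNNING") as resp:
    apps = (json.load(resp).get("apps") or {}).get("app", [])

# Print the biggest memory consumers first.
for app in sorted(apps, key=lambda a: a["allocatedMB"], reverse=True):
    print(f'{app["id"]}  {app["applicationType"]:<8}  '
          f'{app["allocatedMB"]:>8} MB  {app["name"]}')
```

Running this against both clusters makes the comparison between them easy to script.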
Created 10-25-2017 07:19 PM
Thanks for these details. I opened the page and I see many Spark apps running, so do you think that if I restart Spark it will solve this issue?
Created 10-25-2017 07:31 PM
I believe you need to figure out why multiple Spark apps are running. If this is not a production cluster, and no one will be affected by restarting Spark, you can look into that option.
But this makes me believe that the configuration for how many Spark apps are supposed to run is most probably the difference between your two clusters.
I am not an expert in Spark, so I can't point you to the exact config to look for.
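As a starting point for that investigation, you can list just the running Spark apps from the same ResourceManager REST endpoint and see which users and queues they belong to. Here is a minimal sketch along the same lines as above (the RM address is again a placeholder); once you identify a stray app, it can be stopped with yarn application -kill <appId>:

```python
import json
import urllib.request
from datetime import datetime

# Placeholder RM address -- replace with your ResourceManager host:port.
RM_URL = "http://resourcemanager.example.com:8088"

# Filter the app list down to running Spark applications.
url = f"{RM_URL}/ws/v1/cluster/apps?states=RUNNING&applicationTypes=SPARK"
with urllib.request.urlopen(url) as resp:
    apps = (json.load(resp).get("apps") or {}).get("app", [])

# Show who started each Spark app, in which queue, and since when.
for app in apps:
    started = datetime.fromtimestamp(app["startedTime"] / 1000)
    print(f'{app["id"]}  user={app["user"]}  queue={app["queue"]}  '
          f'started={started:%Y-%m-%d %H:%M}  {app["allocatedMB"]} MB')
```

If the same user or queue owns most of the long-running apps, that is usually where the configuration difference between your clusters lies.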