
How to know what is eating YARN memory


How can I tell what is eating YARN memory?

I ask because we have exactly two Ambari clusters.

On one cluster, YARN memory usage is almost 100%.

On the second, it is at 50%.

So how can I find out what is consuming the YARN memory on the first cluster?

Michael-Bronson
1 ACCEPTED SOLUTION

Expert Contributor

@uri ben-ari

You can check this from the YARN ResourceManager UI (RM UI).

From the Ambari YARN page, open the RM UI:

[Screenshot: 39955-screen-shot-2017-10-25-at-112928-am.png]

From the RM UI, you can see the applications that are running under YARN. From there you can look into the memory consumption of each application and compare your clusters for discrepancies.

RM UI showing the list of applications (with Allocated Memory):

[Screenshot: 39956-screen-shot-2017-10-25-at-112723-am.png]

You can click on a specific application for a detailed look at the queue and memory it uses.
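
If you would rather script this than click through the UI, the same information is exposed by the ResourceManager's REST API (the Cluster Applications endpoint, /ws/v1/cluster/apps, which reports fields such as allocatedMB per application). Here is a minimal sketch in Python, assuming the RM web UI is reachable on its default port 8088; the host name below is a placeholder:

```python
import json
from urllib.request import urlopen

# Placeholder host -- replace with your ResourceManager address
# (8088 is the default RM web port).
RM = "http://rm-host.example.com:8088"

# The Cluster Applications endpoint lists apps; restrict it to RUNNING ones.
with urlopen(f"{RM}/ws/v1/cluster/apps?states=RUNNING") as resp:
    apps = (json.load(resp).get("apps") or {}).get("app") or []

# Sort by allocated memory so the biggest consumers come first.
for app in sorted(apps, key=lambda a: a["allocatedMB"], reverse=True):
    print(f'{app["id"]}  {app["allocatedMB"]:>8} MB  '
          f'{app["allocatedVCores"]:>3} vcores  '
          f'user={app["user"]}  queue={app["queue"]}  {app["name"]}')
```

Running the same script against both clusters should make the discrepancy obvious at a glance.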


3 REPLIES


Thanks for the details. I opened the page and I see many Spark applications running, so do you think restarting Spark will solve this issue?

Michael-Bronson

Expert Contributor

I believe you need to figure out why multiple Spark applications are running. If this is not a production cluster, and no one will be affected by restarting Spark, you can look into that option.

But this makes me believe that the configuration settings for Spark, specifically how many Spark applications are supposed to run, are most probably what differs between your two clusters.

I am not enough of a Spark expert to point you to the exact config to look for.
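
That said, you can at least count the running Spark applications on each cluster, and the memory they hold, by filtering the same RM REST endpoint by application type (applicationTypes is a documented query parameter of the Cluster Applications API). A minimal sketch, again with a placeholder host name:

```python
import json
from urllib.request import urlopen

# Placeholder host -- point this at each cluster in turn to compare.
RM = "http://rm-host.example.com:8088"

# Restrict the application list to running Spark apps only.
url = f"{RM}/ws/v1/cluster/apps?states=RUNNING&applicationTypes=SPARK"
with urlopen(url) as resp:
    apps = (json.load(resp).get("apps") or {}).get("app") or []

total_mb = sum(a["allocatedMB"] for a in apps)
print(f"{len(apps)} running Spark app(s), {total_mb} MB allocated in total")
for a in apps:
    print(f'  {a["id"]}  {a["allocatedMB"]} MB  user={a["user"]}  queue={a["queue"]}')
```

If one of them turns out to be a stray long-running job, yarn application -kill <application-id> is the standard CLI way to stop it; just check who owns it first.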