
Memory utilization is high; unable to find what is causing it

(screenshot attached: 41625-capture.png)

1 ACCEPTED SOLUTION

Expert Contributor

Hi @deepak rathod Click the Scheduler link on the Resource Manager page. It shows the resource utilization for each queue, and you can drill down to identify which jobs are consuming the 279 containers. It appears you have 279 containers running at an average of 12 GB each, with a total of 3.41 TB of memory reserved.
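The same per-queue figures shown on the Scheduler page can also be pulled from the ResourceManager REST API (`/ws/v1/cluster/scheduler`). A minimal parsing sketch; the host `rm-host`, the queue names, and the numbers in the sample payload are placeholders, not values from this cluster:

```python
import json

# Canned sample of what GET http://<rm-host>:8088/ws/v1/cluster/scheduler
# returns for the CapacityScheduler (rm-host and all numbers here are
# illustrative placeholders).
sample = json.loads("""
{"scheduler": {"schedulerInfo": {"queues": {"queue": [
  {"queueName": "default", "resourcesUsed": {"memory": 2457600, "vCores": 200}},
  {"queueName": "etl",     "resourcesUsed": {"memory": 1032192, "vCores": 79}}
]}}}}
""")

# Report memory in use per queue, largest first, to see which queue
# accounts for most of the reserved memory.
queues = sample["scheduler"]["schedulerInfo"]["queues"]["queue"]
for q in sorted(queues, key=lambda q: q["resourcesUsed"]["memory"], reverse=True):
    used = q["resourcesUsed"]
    print(f'{q["queueName"]}: {used["memory"] / 1024:.0f} GB, {used["vCores"]} vcores')
```

Against a live cluster you would replace the canned `sample` with the JSON fetched from the ResourceManager on port 8088.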


7 REPLIES

Hi @Saumil Mayani I found the jobs and their container consumption, but I am unable to drill down to see the memory used by each job. Is there any way to drill down to the memory-consumption level?

Expert Contributor

Hi @deepak rathod Could you please share a screen capture of the Scheduler page? It should show, for each application, Allocated Memory and vCores, along with Running Containers. You can sort to see which application has the most Allocated Memory MB. Sample attached.

(screenshot attached: 40042-screen-shot-2017-10-30-at-21401-pm.png)

(screenshot attached: 41630-capture1.png)

This is my scheduler, @Saumil Mayani.

Expert Contributor

Hi @deepak rathod It appears that the HDP version you are running does not show this information in the YARN Resource Manager UI. The sample screenshot I attached earlier was from HDP-2.6.2.0-205. You may need to upgrade your HDP stack.

@Saumil Mayani But my version is Hadoop 2.7.1.2.3.2.0-2950.
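Even where the Resource Manager UI on an older HDP release does not display the Allocated Memory columns, the ResourceManager REST API on Hadoop 2.7.x should still expose per-application allocation via `/ws/v1/cluster/apps`, which may avoid a stack upgrade. A parsing sketch, untested against a live cluster; `rm-host` and the application IDs, users, and numbers in the sample are placeholders:

```python
import json

# Canned example of GET http://<rm-host>:8088/ws/v1/cluster/apps?states=RUNNING
# (rm-host and the application ids/users/numbers are illustrative placeholders).
sample = json.loads("""
{"apps": {"app": [
  {"id": "application_1509000000000_0001", "user": "hive", "queue": "default",
   "allocatedMB": 147456, "allocatedVCores": 12, "runningContainers": 12},
  {"id": "application_1509000000000_0002", "user": "spark", "queue": "etl",
   "allocatedMB": 98304,  "allocatedVCores": 8,  "runningContainers": 8}
]}}
""")

# Sort running applications by allocated memory, largest first, to find
# which jobs hold the most of the cluster's reserved memory.
apps = sample["apps"]["app"]
for a in sorted(apps, key=lambda a: a["allocatedMB"], reverse=True):
    print(f'{a["id"]} queue={a["queue"]} '
          f'allocatedMB={a["allocatedMB"]} containers={a["runningContainers"]}')
```

On a live cluster, fetch the JSON from port 8088 of the ResourceManager host in place of the canned `sample`.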
