
Memory utilization is high; unable to find what is causing it

[Screenshot: 41625-capture.png]

1 ACCEPTED SOLUTION

Super Collaborator

Hi @deepak rathod, click the Scheduler link on the Resource Manager page. It shows the resource utilization for each queue, and you can drill down to identify which jobs are consuming the 279 containers. It appears you have 279 containers running at an average of roughly 12 GB each, with a total of 3.41 TB of memory reserved.
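As a sanity check on the totals above, the cluster-wide figures can be reproduced from the per-application rows shown on the Scheduler page. A minimal Python sketch, using hypothetical per-application numbers (the real values come from your own Scheduler page):

```python
# Hypothetical per-application stats as they appear on the YARN Scheduler page:
# (application id, running containers, allocated memory in GB)
apps = [
    ("application_1_0001", 150, 1880),
    ("application_1_0002", 90, 1020),
    ("application_1_0003", 39, 592),
]

# Aggregate the same way the Scheduler page does across all running apps.
total_containers = sum(containers for _, containers, _ in apps)
total_gb = sum(mem_gb for _, _, mem_gb in apps)
avg_gb_per_container = total_gb / total_containers

print(total_containers)           # container count across all apps
print(round(total_gb / 1024, 2))  # total allocated memory in TB
print(round(avg_gb_per_container, 1))
```

With these made-up rows the totals come out to 279 containers, 3.41 TB, and about 12.5 GB per container, which is how figures like those quoted above are derived.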


7 REPLIES


Hi @Saumil Mayani, I found the jobs and their container consumption, but I am unable to drill down to see the memory used by each job. Is there any way to drill down to the memory-consumption level?

Super Collaborator

Hi @deepak rathod, could you please share a screen capture of the Scheduler page? For each application it should show Allocated Memory and VCores, along with Running Containers. You can sort to see which application has the most Allocated Memory MB. Sample attached.
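The same per-application Allocated Memory figures are also exposed by the ResourceManager REST API at `/ws/v1/cluster/apps`, which can serve as a fallback when the UI does not display these columns. A minimal Python sketch, assuming a hypothetical RM address `rm-host:8088` (8088 is the default RM web port) and using a canned response of the documented shape in place of the live HTTP call:

```python
import json

# On a live cluster you would fetch:
#   http://rm-host:8088/ws/v1/cluster/apps?states=RUNNING
# (rm-host is an assumed hostname). A canned response of the same
# shape stands in for the HTTP call here.
response = json.loads("""
{
  "apps": {
    "app": [
      {"id": "application_1_0001", "queue": "default", "allocatedMB": 196608,
       "allocatedVCores": 48, "runningContainers": 16},
      {"id": "application_1_0002", "queue": "etl", "allocatedMB": 49152,
       "allocatedVCores": 12, "runningContainers": 4}
    ]
  }
}
""")

apps = response["apps"]["app"]
# Sort descending by allocated memory to surface the heaviest consumers first.
apps.sort(key=lambda a: a["allocatedMB"], reverse=True)
for a in apps:
    print(a["id"], a["queue"],
          f'{a["allocatedMB"] // 1024} GB', a["runningContainers"])
```

Sorting on `allocatedMB` here mirrors sorting the Allocated Memory MB column in the UI.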

[Screenshot: 40042-screen-shot-2017-10-30-at-21401-pm.png]

[Screenshot: 41630-capture1.png]

This is my scheduler, @Saumil Mayani.

Super Collaborator

Hi @deepak rathod, it appears that the HDP version you are running does not show this information in the YARN Resource Manager UI. The sample screenshot I attached earlier was from HDP-2.6.2.0-205. You may need to upgrade your HDP stack.

@Saumil Mayani, but my version is Hadoop 2.7.1.2.3.2.0-2950.

Super Collaborator