
Spark - analyzing a job for performance tuning


Hello Everyone - We have a Hadoop cluster where Spark runs on YARN. I don't have much knowledge of what to look for, or where to look, in the Spark History Server when a Spark application (query) is taking a long time.

Could you share some details on how to analyze a Spark job, and any keywords or phrases to look for?


Super Guru

If the job is still running, you can go to the YARN ResourceManager UI (usually on port 8088) and click the ApplicationMaster link. This redirects to the Spark job UI, where you can see how many tasks are running/finished/failed and how long each task took to finish.
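The same information is also exposed programmatically through the ResourceManager REST API (`/ws/v1/cluster/apps`). Here is a minimal stdlib-only sketch; the response shape follows the documented YARN REST API, but exact fields can vary by Hadoop version, and the RM host/port is an assumption you must replace with your own:

```python
import json
from urllib.request import urlopen


def extract_app_summaries(payload: dict) -> list:
    """Pull (id, name, elapsed seconds, tracking URL) tuples out of a
    ResourceManager /ws/v1/cluster/apps response payload."""
    apps = (payload.get("apps") or {}).get("app") or []
    return [
        (
            a["id"],
            a.get("name", ""),
            a.get("elapsedTime", 0) / 1000.0,  # elapsedTime is in milliseconds
            a.get("trackingUrl", ""),          # proxies to the Spark UI for Spark apps
        )
        for a in apps
    ]


def summarize_running_apps(rm_url: str) -> list:
    """Fetch RUNNING applications from the ResourceManager REST API.
    rm_url is e.g. "http://your-rm-host:8088" (placeholder)."""
    with urlopen(f"{rm_url}/ws/v1/cluster/apps?states=RUNNING") as resp:
        return extract_app_summaries(json.load(resp))
```

Apps that have been "RUNNING" for far longer than their usual elapsed time are the ones worth opening in the Spark UI first.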


If you want to debug a job that has finished or failed, go to the Spark History Server (usually on port 18080). There you can see the time taken by each task to execute, as well as the DAG of the Spark job.
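The History Server renders Spark's JSON event logs, and you can also mine those logs directly to find the slowest tasks. A sketch assuming the standard event-log format, where each line is one JSON object and `SparkListenerTaskEnd` events carry a `Task Info` map with `Launch Time` and `Finish Time` as epoch milliseconds (verify the field names against your Spark version):

```python
import json


def slowest_tasks(event_log_lines, top_n=5):
    """Scan Spark event-log lines and return the top-N tasks by duration.

    Each returned entry is (duration_seconds, stage_id, task_id).
    """
    durations = []
    for line in event_log_lines:
        event = json.loads(line)
        if event.get("Event") != "SparkListenerTaskEnd":
            continue
        info = event.get("Task Info", {})
        # Launch Time / Finish Time are epoch milliseconds in the event log.
        took = (info["Finish Time"] - info["Launch Time"]) / 1000.0
        durations.append((took, event.get("Stage ID"), info.get("Task ID")))
    return sorted(durations, reverse=True)[:top_n]
```

A handful of tasks taking far longer than the rest of their stage usually points to data skew; uniformly slow tasks point to the stage's work itself (shuffle, I/O, expensive UDFs).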


Starting from Spark 2 (Scala only), you can use spark.time to measure how long an action takes to execute.


You can also call explain() to get detailed information about the execution plan of the Spark job.
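In PySpark that looks like the sketch below; it needs a running SparkSession, and the DataFrame here is just a made-up example:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("explain-demo").getOrCreate()

# Hypothetical query, stand-in for whatever job you are tuning.
df = spark.range(1000).selectExpr("id % 10 AS key", "id AS value")
agg = df.groupBy("key").sum("value")

# Prints the physical plan; look for expensive operators such as
# Exchange (a shuffle), SortMergeJoin, and full scans of large tables.
agg.explain()

# In Spark 3+ you can pass a mode for more detail, e.g.:
# agg.explain("formatted")
```

In the printed plan, each Exchange operator marks a shuffle boundary, which is usually where the time goes.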

Useful links: link1 link2 link3