Spark analyze a job - performance tuning

New Contributor

Hello Everyone - We have a Hadoop cluster where Spark runs on YARN. I don't have much knowledge of what to look for, or where to look, in the Spark job history server when a Spark application (query) is taking a long time.

Could you share some details on how to analyze a Spark job, and any keywords or phrases to look for?

1 REPLY

Re: Spark analyze a job - performance tuning

Super Guru
@Bharath

If the job is still running, go to the YARN ResourceManager UI (usually on port 8088) and click the ApplicationMaster link. This redirects you to the Spark UI, where you can see how many tasks are running/finished/failed and how long each task took to finish.
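If you have a shell open inside the running application, you can also print the Spark UI address directly instead of navigating from the ResourceManager. A minimal sketch, assuming Spark 2.x where SparkContext exposes uiWebUrl (in spark-shell, `spark` is the pre-built SparkSession):

    // uiWebUrl returns the address of this application's Spark UI,
    // the same page the ApplicationMaster link redirects to
    println(spark.sparkContext.uiWebUrl.getOrElse("Spark UI is disabled"))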

-

If you want to debug a job that has finished or failed, go to the Spark History Server (usually running on port 18080). There you can see the time taken by each task, as well as the DAG of the Spark job.
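Note that an application only shows up in the History Server if event logging was enabled when it ran. A minimal sketch of the relevant settings (the HDFS log directory here is hypothetical; your cluster may already set these in spark-defaults.conf):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("history-server-demo")
      .config("spark.eventLog.enabled", "true")            // write event logs for the History Server
      .config("spark.eventLog.dir", "hdfs:///spark-logs")  // hypothetical log directory
      .getOrCreate()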

-

Starting from Spark2 (Scala only), you can use spark.time to measure how long an action takes to execute.
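For example, in spark-shell (a minimal sketch; the DataFrame here is only an illustration):

    // spark.time runs the block, prints "Time taken: ... ms",
    // and returns the block's result
    val total = spark.time {
      spark.range(0L, 100000000L).selectExpr("sum(id)").collect()
    }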

-

You can also call explain() on a DataFrame/Dataset to get detailed information about how Spark will execute the query.
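A minimal sketch (the DataFrame is hypothetical): explain() prints the physical plan, while explain(true) prints the parsed, analyzed, optimized, and physical plans.

    import org.apache.spark.sql.functions.col

    val df = spark.range(0L, 1000L)
      .withColumn("bucket", col("id") % 10)
      .groupBy("bucket")
      .count()

    df.explain()      // physical plan only
    df.explain(true)  // all plan stages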

Useful links: link1 link2 link3