How to monitor for failed Spark jobs

New Contributor

I am running a Spark job that takes about 6 hours.

1) How can I monitor which nodes my Spark jobs are running on?

2) How can I diagnose failed Spark jobs?

1 REPLY

Master Guru

@SPARK_LEARN The docs below should help you:

https://docs.cloudera.com/documentation/enterprise/5-9-x/topics/operation_spark_applications.html

https://spark.apache.org/docs/1.6.0/monitoring.html
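
On top of those docs, both of your questions can be answered programmatically through the Spark monitoring REST API exposed by the driver UI (default port 4040) and the history server (default port 18080). Below is a minimal sketch; the BASE_URL host and port are assumptions you will need to adjust for your cluster. It prints each executor's host for every application (question 1) and any FAILED stages (question 2).

#!/usr/bin/env python
# Minimal sketch: poll the Spark monitoring REST API for executor placement
# and failed stages. BASE_URL is an assumption -- point it at your running
# driver UI (default port 4040) or the Spark history server (default 18080).
import json
import urllib.request

BASE_URL = "http://localhost:4040/api/v1"  # adjust host/port for your cluster

def get_json(path):
    # Fetch one REST endpoint and decode the JSON payload.
    with urllib.request.urlopen(BASE_URL + path) as resp:
        return json.loads(resp.read().decode("utf-8"))

for app in get_json("/applications"):
    app_id = app["id"]
    print("Application: %s (%s)" % (app_id, app["name"]))

    # 1) Which nodes are the executors running on?
    for ex in get_json("/applications/%s/executors" % app_id):
        print("  executor %s on %s, failed tasks: %d"
              % (ex["id"], ex["hostPort"], ex["failedTasks"]))

    # 2) Which stages failed? Their names point at the failing transformation.
    for stage in get_json("/applications/%s/stages" % app_id):
        if stage["status"] == "FAILED":
            print("  FAILED stage %d: %s" % (stage["stageId"], stage["name"]))

For a job that has already died on YARN, the aggregated container logs usually contain the actual stack trace: yarn logs -applicationId <application_id>. The Stages and Executors tabs of the Spark UI show the same information interactively.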


Cheers!