Calculating CPU time for a Spark Job

We are using Spark 1.5 and would like to find out the CPU time taken by a specific job. I don’t see this metric in the Spark History Server UI. Is this metric stored anywhere?

2 REPLIES

Re: Calculating CPU time for a Spark Job

Expert Contributor

Not sure it’s exactly what you’re looking for, but in the YARN history server you can get the Aggregate Resource Allocation for a Spark job. For example:

Aggregate Resource Allocation: 13219453 MB-seconds, 8604 vcore-seconds
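
If you want to pull those same figures programmatically, the ResourceManager REST API exposes them as memorySeconds and vcoreSeconds under /ws/v1/cluster/apps/<application-id>. Here is a minimal, dependency-free sketch in Scala; the host and application ID are placeholders, and a real client would use a JSON library rather than regexes:

import scala.io.Source

object YarnAppResources {
  def main(args: Array[String]): Unit = {
    // Placeholders: substitute your ResourceManager address and application ID.
    val rmHost = "resourcemanager.example.com:8088"
    val appId  = "application_1450000000000_0001"

    // The RM REST API reports the same aggregate totals shown in the UI.
    val json = Source.fromURL(s"http://$rmHost/ws/v1/cluster/apps/$appId").mkString

    // Crude regex extraction, just to keep the sketch dependency-free.
    def field(name: String): Option[String] = {
      val pattern = ("\"" + name + "\"\\s*:\\s*(\\d+)").r
      pattern.findFirstMatchIn(json).map(_.group(1))
    }

    println("MB-seconds:    " + field("memorySeconds").getOrElse("n/a"))
    println("vcore-seconds: " + field("vcoreSeconds").getOrElse("n/a"))
  }
}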

Re: Calculating CPU time for a Spark Job

Seems like there's a project called org.wisdom-framework mentioned at http://stackoverflow.com/questions/35801271/spark-cpu-utilization-monitoring that might be helpful. You can always get the CPU statistics for the full YARN job, but that's a composite figure for the whole application rather than a per-job breakdown.
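
If collecting the number from inside the application is an option, another approach (not from the thread above, just a sketch) is a SparkListener that totals task time as the job runs. One caveat: in Spark 1.5 TaskMetrics exposes executorRunTime, which is wall-clock time the task spent running on an executor, not a true CPU counter, so treat the total as an upper bound on CPU time:

import java.util.concurrent.atomic.AtomicLong
import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

// Sums executorRunTime over every finished task. This is executor
// wall-clock time, not strict CPU time; Spark 1.5 does not expose a
// per-task CPU counter in TaskMetrics.
class RunTimeListener extends SparkListener {
  val totalRunTimeMs = new AtomicLong(0L)

  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    val metrics = taskEnd.taskMetrics
    if (metrics != null) {  // metrics can be null, e.g. for failed tasks
      totalRunTimeMs.addAndGet(metrics.executorRunTime)
    }
  }
}

// Usage:
//   val listener = new RunTimeListener
//   sc.addSparkListener(listener)
//   ... run the job ...
//   println("Total executor run time: " + listener.totalRunTimeMs.get + " ms")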