Is it possible to get the CPU utilization of a Spark job programmatically using the application ID?
Created ‎08-19-2020 11:59 PM
Hi all,
I would like to know whether we can fetch the CPU utilization of a Spark job programmatically using the application ID. Any help would be appreciated.
Thank you
Created ‎08-26-2020 01:22 AM
Hello @KSKR ,
thank you for raising the question on "how to fetch the CPU utilization for a Spark job programmatically".
One way to do this is via the Spark REST API.
You should first decide whether you need "live data" while the job is running, or an analysis after the application has finished.
While the application is running, you can connect to the driver and fetch the live metrics. Once the application has finished, you can either parse the event log files (JSON) for the CPU time, or use the Spark REST API and let the Spark History Server serve you the data.
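As a rough illustration of the REST API approach, the sketch below queries the Spark History Server's per-stage metrics for a finished application and sums the executor CPU time. The host/port, application ID, and the assumption that each stage record carries an `executorCpuTime` field (in nanoseconds) are examples here; check the REST API of your Spark version and substitute your own cluster details.

```python
import json
import urllib.request

# Hypothetical History Server address; adjust for your cluster.
# For a live application, the same /api/v1/... endpoints are typically
# served by the driver UI (port 4040 by default) instead.
HISTORY_SERVER = "http://localhost:18080"


def fetch_stages(app_id, base_url=HISTORY_SERVER):
    """Fetch the per-stage metrics for an application from the
    Spark REST API: /api/v1/applications/{app_id}/stages."""
    url = f"{base_url}/api/v1/applications/{app_id}/stages"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)


def total_cpu_seconds(stages):
    """Sum executorCpuTime (reported in nanoseconds) across all
    stage records and convert to seconds."""
    return sum(s.get("executorCpuTime", 0) for s in stages) / 1e9


if __name__ == "__main__":
    # Example application ID format; replace with your own.
    stages = fetch_stages("application_1597906732000_0001")
    print(f"Total executor CPU time: {total_cpu_seconds(stages):.1f} s")
```

To get a utilization percentage rather than raw CPU time, you could divide the summed CPU time by (elapsed wall-clock time × total allocated cores), both of which are also available through the REST API.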
What is your exact requirement? What would you like to achieve?
Thank you,
Ferenc
Ferenc Erdelyi, Technical Solutions Manager
Was your question answered? Make sure to mark the answer as the accepted solution.
If you find a reply useful, say thanks by clicking on the thumbs up button.
Learn more about the Cloudera Community:
