Created 08-23-2016 04:00 PM
I have upgraded HDP from 2.3 to 2.4 in our TEST environment, and I see that Spark changed between these versions.
How can I get a list of the jobs that currently use Spark?
Created 08-26-2016 07:23 PM
Your question title asks about dependent components, while the description asks for a list of jobs that currently use Spark. I assume you mean Spark applications (a.k.a. jobs) running on the cluster. If you have access to Ambari, click the YARN link, then Quick Links, then Resource Manager UI; this assumes your Spark runs on YARN. Otherwise, you can go directly to the ResourceManager UI. You will need the IP address of the server where the ResourceManager runs, as well as the port (the default is 8088).
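If you prefer to script this rather than use the UI, here is a minimal sketch that queries the ResourceManager REST API (`/ws/v1/cluster/apps`, which supports `applicationTypes` and `states` filters in Hadoop 2.x) for running Spark applications. The host name is a placeholder; substitute your ResourceManager's address and port:

```python
import json
import urllib.request

# Placeholder: replace with your ResourceManager host and port (default 8088).
RM = "http://resourcemanager.example.com:8088"

# Ask the YARN ResourceManager REST API for Spark apps in RUNNING state.
url = RM + "/ws/v1/cluster/apps?applicationTypes=SPARK&states=RUNNING"
with urllib.request.urlopen(url) as resp:
    data = json.load(resp)

# When no applications match, the "apps" element is null, so guard for that.
apps = (data.get("apps") or {}).get("app") or []
for app in apps:
    print(app["id"], app["name"], app["user"], app["state"])
```

You could also run `yarn application -list` on a cluster node for a similar view, but the REST call works from any machine that can reach the ResourceManager.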