How to know which node is the Driver Node and which are the Worker Nodes for Spark in Cloudera?

Labels:
- Apache Spark
Created ‎02-01-2023 05:49 PM
I'm using a tool in which I have to point to the master node (driver node) of the Cloudera Spark cluster (spark://<some-spark-master>:7077). As I understand it, Spark has a "Master Node", a "Driver Node", and "Worker Nodes".
So I went to Cloudera Manager and checked the Configuration tab of the Spark service, but all I found were a "Gateway" instance and a "History Server" instance. Where are the "Driver" and "Worker" instances? I can't add these two instances through "Add Role Instances" either.
My guess is that they're in the YARN service configuration, but I can't find anything related to "Master", "Driver", or "Worker" there either.
So what is the "Spark Master" link that ends with 7077 (which node is it)? I can't find it anywhere in the Configuration tab.
Created ‎03-30-2023 04:31 AM
Cloudera supports the YARN and Kubernetes deployment modes; it does not support Standalone mode (port 7077 is the Spark Master port only in Standalone mode, which is why you can't find it in Cloudera Manager).
To check which node the driver was launched on and which nodes the executors were launched on, open the Spark UI or the Spark History Server UI for that application and go to the Executors tab, which lists all executors.
In the executors table, look at the Executor ID column: the row whose executor ID is 'driver' is the Driver Node, and all the remaining rows are executors.
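The same executors listing is also exposed by Spark's monitoring REST API at `/api/v1/applications/<app-id>/executors`, so you can find the driver host programmatically instead of clicking through the UI. Below is a minimal sketch: the hostnames and the trimmed-down JSON sample are made up for illustration, but the `id` and `hostPort` fields match what the real endpoint returns, and the `id == "driver"` check is the same rule described above.

```python
import json

# Hypothetical, trimmed sample of what the Spark REST endpoint
# /api/v1/applications/<app-id>/executors returns. The hostnames are
# invented; the "id" and "hostPort" fields are real Spark API fields.
sample = json.loads("""
[
  {"id": "driver", "hostPort": "node01.example.com:40123", "isActive": true},
  {"id": "1",      "hostPort": "node02.example.com:40456", "isActive": true},
  {"id": "2",      "hostPort": "node03.example.com:40789", "isActive": true}
]
""")

def driver_host(executors):
    # The entry whose executor id is 'driver' is the driver node;
    # every other entry is an executor.
    for e in executors:
        if e["id"] == "driver":
            return e["hostPort"].split(":")[0]
    return None

print(driver_host(sample))  # node01.example.com
```

In a live cluster you would fetch the JSON with an HTTP GET against the Spark UI or History Server (for example via `urllib.request` or `requests`) instead of using a hard-coded sample.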
