Spark port binding issue
- Labels: Apache Spark, Apache YARN
Created on ‎03-08-2019 06:29 AM - edited ‎09-16-2022 07:13 AM
We are getting the errors below when 15 or 16 Spark jobs are running in parallel. We have a 21-node cluster and run Spark on YARN. Regardless of the number of nodes in the cluster, does one cluster get to use only 17 ports, or is it 17 ports per node? How do we avoid this when we run 50 or 100 Spark jobs in parallel?
WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
:::::
WARN util.Utils: Service 'SparkUI' could not bind on port 4055. Attempting port 4056.
Address already in use: Service 'SparkUI' failed after 16 retries! Consider explicitly setting the appropriate port for the service 'SparkUI'.
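For context: the "16 retries" in the last message comes from Spark's `spark.port.maxRetries` setting (default 16). Each driver starts at `spark.ui.port` (default 4040) and probes successive ports on its own host, so the limit is per driver host, not per cluster. A minimal sketch of raising the retry limit at submit time (the application file name is a placeholder, not from this thread):

```shell
# Each driver tries spark.ui.port, then the next port, and so on, for up to
# spark.port.maxRetries additional attempts (default 16 -> 17 ports total).
# Raising the retry count lets more concurrent drivers bind UIs on one host.
spark-submit \
  --conf spark.port.maxRetries=100 \
  --conf spark.ui.port=4040 \
  your_app.py   # placeholder application
```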
Created ‎03-12-2019 09:26 AM
If you have a limited number of ports available, you can assign a port to each application explicitly:
--conf "spark.driver.port=4050"
--conf "spark.executor.port=51001"
--conf "spark.ui.port=4005"
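Putting those flags together, a submission might look like the sketch below. The port numbers and the application jar name are illustrative; also note that `spark.executor.port` applied to older (pre-2.0, Akka-based) Spark releases, while newer versions choose executor ports dynamically:

```shell
# Illustrative: pin per-service ports for one application so concurrent jobs
# don't collide. Values and app.jar are placeholders, not from this thread.
spark-submit \
  --conf "spark.driver.port=4050" \
  --conf "spark.executor.port=51001" \
  --conf "spark.ui.port=4005" \
  app.jar
```

When pinning ports per application, each concurrent job needs its own non-overlapping values, which is why raising `spark.port.maxRetries` is often simpler at larger scale.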
Hope it helps
Thanks
Jerry
