Support Questions
Find answers, ask questions, and share your expertise

Spark port binding issue

We are getting the errors below when 15 or 16 Spark jobs run in parallel. We have a 21-node cluster and run Spark on YARN. Regardless of the number of nodes in the cluster, does the whole cluster get only 17 ports, or is it 17 ports per node? How do we avoid this when we run 50 or 100 Spark jobs in parallel?


WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.


WARN util.Utils: Service 'SparkUI' could not bind on port 4055. Attempting port 4056.

Address already in use: Service 'SparkUI' failed after 16 retries! Consider explicitly setting the appropriate port for the service 'SparkUI'.


Re: Spark port binding issue

Hi Naveen,

If you have a limited number of ports available, you can assign ports explicitly for each application:

--conf "spark.driver.port=4050"
--conf "spark.executor.port=51001"
--conf "spark.ui.port=4005"
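Alternatively, if pinning a port per job is impractical at 50-100 concurrent jobs, you can raise the retry budget instead. The "failed after 16 retries" message comes from `spark.port.maxRetries` (default 16), so each driver probes only 17 candidate UI ports on its host. A sketch of a submit command (the app path is a placeholder):

```shell
# Allow each driver's SparkUI (and other services) to probe up to 100
# ports above the base before giving up, instead of the default 16.
spark-submit \
  --master yarn \
  --conf "spark.ui.port=4040" \
  --conf "spark.port.maxRetries=100" \
  your_app.py
```

Note that the port conflicts happen per host: every driver running on the same machine (e.g. a shared gateway node in yarn-client mode) competes for the same 4040+ range, which is why parallel jobs hit the limit even on a large cluster.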

Hope it helps
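To see why the log stops after exactly 17 attempts, here is a minimal sketch of the bind-and-retry behaviour (plain Python sockets for illustration, not Spark's actual code; the function name is made up):

```python
import socket

def bind_with_retries(base_port, max_retries=16):
    """Mimic Spark's port retry loop (spark.port.maxRetries, default 16):
    try base_port, then base_port+1, and so on, for max_retries + 1
    candidate ports in total. Returns (socket, port) or raises OSError."""
    for offset in range(max_retries + 1):
        port = base_port + offset
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        try:
            s.bind(("127.0.0.1", port))
            return s, port
        except OSError:
            s.close()
    raise OSError(f"failed after {max_retries} retries starting at {base_port}")

# With the defaults each driver probes 17 ports (4040-4056) on the host
# where that driver runs -- the limit is per host, not per cluster.
```

So the 18th driver on a single host runs out of ports, no matter how many nodes the cluster has.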