
Spark port binding issue


Contributor

We are getting the errors below when 15 or 16 Spark jobs run in parallel. We have a 21-node cluster and run Spark on YARN. Regardless of the number of nodes in the cluster, does one cluster get to use only 17 ports, or is it 17 ports per node? How can we avoid this when we run 50 or 100 Spark jobs in parallel?


WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.

:::::

WARN util.Utils: Service 'SparkUI' could not bind on port 4055. Attempting port 4056.

Address already in use: Service 'SparkUI' failed after 16 retries! Consider explicitly setting the appropriate port for the service 'SparkUI'.
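The arithmetic behind these log lines: the SparkUI starts at spark.ui.port (default 4040) and retries up to spark.port.maxRetries (default 16) consecutive ports, so the limit is 17 candidate ports per driver host, not per cluster. One way to run more concurrent drivers on the same host is to raise that retry budget per job; a minimal sketch (the class and jar names below are hypothetical):

```shell
# spark.ui.port (default 4040) + spark.port.maxRetries (default 16)
# = 17 candidate UI ports per driver host before the bind fails.
# Raising the retry budget lets more drivers coexist on one host.
# (com.example.MyApp and myapp.jar are placeholder names)
spark-submit \
  --master yarn \
  --conf spark.port.maxRetries=100 \
  --class com.example.MyApp \
  myapp.jar
```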


Re: Spark port binding issue

Rising Star
Hi Naveen,

If you have a limited number of ports available, you can assign a port to each application explicitly:

--conf "spark.driver.port=4050"
--conf "spark.executor.port=51001"
--conf "spark.ui.port=4005"

Hope it helps

Thanks
Jerry