<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Spark port binding issue in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Spark-port-binding-issue/m-p/87661#M21510</link>
    <description>Hi Naveen,&lt;BR /&gt;&lt;BR /&gt;If you have a limited number of ports available, you can assign a specific port to each application:&lt;BR /&gt;&lt;BR /&gt;--conf "spark.driver.port=4050"&lt;BR /&gt;--conf "spark.executor.port=51001"&lt;BR /&gt;--conf "spark.ui.port=4005"&lt;BR /&gt;&lt;BR /&gt;Hope it helps.&lt;BR /&gt;&lt;BR /&gt;Thanks,&lt;BR /&gt;Jerry</description>
    <pubDate>Tue, 12 Mar 2019 16:26:36 GMT</pubDate>
    <dc:creator>Jerry</dc:creator>
    <dc:date>2019-03-12T16:26:36Z</dc:date>
    <item>
      <title>Spark port binding issue</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-port-binding-issue/m-p/87491#M21509</link>
      <description>&lt;P&gt;We are getting the errors below when 15 or 16 Spark jobs are running in parallel. We have a 21-node cluster and run Spark on YARN. Regardless of the number of nodes in the cluster, does the whole cluster get to use only 17 ports, or is it 17 ports per node? How can we avoid this when we run 50 or 100 Spark jobs in parallel?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.&lt;/P&gt;&lt;P&gt;:::::&lt;/P&gt;&lt;P&gt;WARN util.Utils: Service 'SparkUI' could not bind on port 4055. Attempting port 4056.&lt;/P&gt;&lt;P&gt;Address already in use: Service 'SparkUI' failed after 16 retries! Consider explicitly setting the appropriate port for the service 'SparkUI'&lt;/P&gt;</description>
      <pubDate>Fri, 16 Sep 2022 14:13:09 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-port-binding-issue/m-p/87491#M21509</guid>
      <dc:creator>naveen1</dc:creator>
      <dc:date>2022-09-16T14:13:09Z</dc:date>
    </item>
    <item>
      <title>Re: Spark port binding issue</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-port-binding-issue/m-p/87661#M21510</link>
      <description>Hi Naveen,&lt;BR /&gt;&lt;BR /&gt;If you have a limited number of ports available, you can assign a specific port to each application:&lt;BR /&gt;&lt;BR /&gt;--conf "spark.driver.port=4050"&lt;BR /&gt;--conf "spark.executor.port=51001"&lt;BR /&gt;--conf "spark.ui.port=4005"&lt;BR /&gt;&lt;BR /&gt;Hope it helps.&lt;BR /&gt;&lt;BR /&gt;Thanks,&lt;BR /&gt;Jerry</description>
      <pubDate>Tue, 12 Mar 2019 16:26:36 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-port-binding-issue/m-p/87661#M21510</guid>
      <dc:creator>Jerry</dc:creator>
      <dc:date>2019-03-12T16:26:36Z</dc:date>
    </item>
  </channel>
</rss>