<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Issue with setting up spark clients without ambari - Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Issue-with-setting-up-spark-clients-without-ambari/m-p/114052#M42787</link>
    <description>Forum thread: Issue with setting up spark clients without ambari.</description>
    <pubDate>Wed, 05 Oct 2016 17:12:39 GMT</pubDate>
    <dc:creator>deepak.subhramanian</dc:creator>
    <dc:date>2016-10-05T17:12:39Z</dc:date>
    <item>
      <title>Issue with setting up spark clients without ambari</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Issue-with-setting-up-spark-clients-without-ambari/m-p/114052#M42787</link>
      <description>&lt;P&gt;I am trying to evaluate sparklyr on a test machine running RStudio Server. Since the machine is outside the HDP cluster, I installed the Hadoop and Spark clients and copied the config files from our test HDP cluster into /etc/hadoop/conf. I set HADOOP_CONF_DIR, YARN_CONF_DIR and SPARK_HOME to point to the HDP files. Our Hadoop cluster is integrated with Kerberos. I am able to run spark-shell in local mode and read HDFS files from the test cluster, but I am not able to run spark-shell in yarn-client mode.&lt;/P&gt;&lt;P&gt;I am getting the following error in the application log:&lt;/P&gt;&lt;PRE&gt;16/10/05 11:30:57 INFO yarn.ApplicationMaster: Waiting for Spark driver to be reachable.
16/10/05 11:32:00 ERROR yarn.ApplicationMaster: Failed to connect to driver at 10.100.99.100:42948, retrying ...
16/10/05 11:33:03 ERROR yarn.ApplicationMaster: Failed to connect to driver at 10.100.99.100:42948, retrying ...
16/10/05 11:33:03 ERROR yarn.ApplicationMaster: Uncaught exception: 
org.apache.spark.SparkException: Failed to connect to driver!&lt;/PRE&gt;&lt;P&gt;The job is submitted and goes to the ACCEPTED state, but never to RUNNING.&lt;/P&gt;&lt;P&gt;16/10/05 10:43:24 INFO impl.YarnClientImpl: Submitted application application_1474880908029_0858&lt;/P&gt;&lt;P&gt;16/10/05 10:43:24 INFO cluster.SchedulerExtensionServices: Starting Yarn extension services with app application_1474880908029_0858 and attemptId None&lt;/P&gt;&lt;P&gt;16/10/05 10:43:25 INFO yarn.Client: Application report for application_1474880908029_0858 (state: ACCEPTED)&lt;/P&gt;&lt;P&gt;16/10/05 10:43:25 INFO yarn.Client: &lt;/P&gt;&lt;P&gt; client token: Token { kind: YARN_CLIENT_TOKEN, service:  }&lt;/P&gt;&lt;P&gt; diagnostics: N/A&lt;/P&gt;&lt;P&gt; ApplicationMaster host: N/A&lt;/P&gt;&lt;P&gt; ApplicationMaster RPC port: -1&lt;/P&gt;&lt;P&gt; queue: default&lt;/P&gt;&lt;P&gt; start time: 1475660604154&lt;/P&gt;&lt;P&gt; final status: UNDEFINED&lt;/P&gt;&lt;P&gt; tracking URL: &lt;A href="http://hostname:8088/proxy/application_1474880908029_0858/" target="_blank"&gt;http://hostname:8088/proxy/application_1474880908029_0858/&lt;/A&gt;&lt;/P&gt;&lt;P&gt; user: dee &lt;/P&gt;&lt;P&gt;16/10/05 10:43:26 INFO yarn.Client: Application report for application_1474880908029_0858 (state: ACCEPTED)&lt;/P&gt;&lt;P&gt;16/10/05 10:43:27 INFO yarn.Client: Application report for application_1474880908029_0858 (state: ACCEPTED)&lt;/P&gt;&lt;P&gt;16/10/05 10:43:28 INFO yarn.Client: Application report for application_1474880908029_0858 (state: ACCEPTED)&lt;/P&gt;&lt;P&gt;16/10/05 10:43:29 INFO yarn.Client: Application report for application_1474880908029_0858 (state: ACCEPTED)&lt;/P&gt;&lt;P&gt;Here is the application log:&lt;/P&gt;&lt;PRE&gt;16/10/05 11:30:57 INFO spark.SecurityManager: Changing view acls to: deesub
16/10/05 11:30:57 INFO spark.SecurityManager: Changing modify acls to: deesub
16/10/05 11:30:57 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(deesub); users with modify permissions: Set(deesub)
16/10/05 11:30:57 INFO yarn.ApplicationMaster: Waiting for Spark driver to be reachable.
16/10/05 11:32:00 ERROR yarn.ApplicationMaster: Failed to connect to driver at 10.100.99.100:42948, retrying ...
16/10/05 11:33:03 ERROR yarn.ApplicationMaster: Failed to connect to driver at 10.100.99.100:42948, retrying ...
16/10/05 11:33:03 ERROR yarn.ApplicationMaster: Uncaught exception: 
org.apache.spark.SparkException: Failed to connect to driver!
	at org.apache.spark.deploy.yarn.ApplicationMaster.waitForSparkDriver(ApplicationMaster.scala:501)
	at org.apache.spark.deploy.yarn.ApplicationMaster.runExecutorLauncher(ApplicationMaster.scala:362)
	at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:204)
	at org.apache.spark.deploy.yarn.ApplicationMaster$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:672)
	at org.apache.spark.deploy.SparkHadoopUtil$anon$1.run(SparkHadoopUtil.scala:69)
	at org.apache.spark.deploy.SparkHadoopUtil$anon$1.run(SparkHadoopUtil.scala:68)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
	at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:68)
	at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:670)
	at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:697)
	at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
16/10/05 11:33:03 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 10, (reason: Uncaught exception: org.apache.spark.SparkException: Failed to connect to driver!)
16/10/05 11:33:03 INFO util.ShutdownHookManager: Shutdown hook called&lt;/PRE&gt;</description>
      <pubDate>Wed, 05 Oct 2016 17:12:39 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Issue-with-setting-up-spark-clients-without-ambari/m-p/114052#M42787</guid>
      <dc:creator>deepak.subhramanian</dc:creator>
      <dc:date>2016-10-05T17:12:39Z</dc:date>
    </item>
    <item>
      <title>Re: Issue with setting up spark clients without ambari</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Issue-with-setting-up-spark-clients-without-ambari/m-p/114053#M42788</link>
      <description>&lt;P&gt;Did you copy /etc/spark/conf/ as well? Also, create the spark user and copy its headless keytab. You can find all the details &lt;A href="http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.5.0/bk_command-line-installation/content/ch_installing_spark_chapter.html"&gt;here&lt;/A&gt;.&lt;/P&gt;</description>
      <pubDate>Wed, 05 Oct 2016 17:36:41 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Issue-with-setting-up-spark-clients-without-ambari/m-p/114053#M42788</guid>
      <dc:creator>pminovic</dc:creator>
      <dc:date>2016-10-05T17:36:41Z</dc:date>
    </item>
    <item>
      <title>Re: Issue with setting up spark clients without ambari</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Issue-with-setting-up-spark-clients-without-ambari/m-p/114054#M42789</link>
      <description>&lt;P&gt;I had missed copying the Spark conf, but the spark user was already created by the yum install of the Spark clients. I also copied the keytab, but I still get the same error.&lt;/P&gt;&lt;PRE&gt;16/10/05 20:05:55 ERROR ApplicationMaster: Failed to connect to driver at 10.100.100.110:33656, retrying ...
16/10/05 20:06:58 ERROR ApplicationMaster: Failed to connect to driver at 10.100.100.110:33656, retrying ...
16/10/05 20:06:58 ERROR ApplicationMaster: Uncaught exception: 
org.apache.spark.SparkException: Failed to connect to driver!&lt;/PRE&gt;</description>
      <pubDate>Thu, 06 Oct 2016 06:00:19 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Issue-with-setting-up-spark-clients-without-ambari/m-p/114054#M42789</guid>
      <dc:creator>deepak.subhramanian</dc:creator>
      <dc:date>2016-10-06T06:00:19Z</dc:date>
    </item>
    <item>
      <title>Re: Issue with setting up spark clients without ambari</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Issue-with-setting-up-spark-clients-without-ambari/m-p/114055#M42790</link>
      <description>&lt;P&gt;You may have a network issue. If I understand the log correctly, the ApplicationMaster assigned to your Spark job cannot reach the driver running on your external node. Can the cluster nodes access your external node on ephemeral port numbers such as 33656?&lt;/P&gt;</description>
      <pubDate>Thu, 06 Oct 2016 15:22:09 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Issue-with-setting-up-spark-clients-without-ambari/m-p/114055#M42790</guid>
      <dc:creator>pminovic</dc:creator>
      <dc:date>2016-10-06T15:22:09Z</dc:date>
    </item>
    <item>
      <title>Re: Issue with setting up spark clients without ambari</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Issue-with-setting-up-spark-clients-without-ambari/m-p/114056#M42791</link>
      <description>&lt;P&gt;Good point. It looks like it is a firewall issue. &lt;/P&gt;</description>
      <pubDate>Fri, 07 Oct 2016 18:08:21 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Issue-with-setting-up-spark-clients-without-ambari/m-p/114056#M42791</guid>
      <dc:creator>deepak.subhramanian</dc:creator>
      <dc:date>2016-10-07T18:08:21Z</dc:date>
    </item>
  </channel>
</rss>