<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Hive on Spark CDH 5.7 - Failed to create spark client in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-on-Spark-CDH-5-7-Failed-to-create-spark-client/m-p/40752#M25384</link>
    <description>What are your values for executors? And how have you figured out it was a memory issue?&lt;BR /&gt;&lt;BR /&gt;Thanks</description>
    <pubDate>Thu, 12 May 2016 05:18:35 GMT</pubDate>
    <dc:creator>dmitry_kniazev</dc:creator>
    <dc:date>2016-05-12T05:18:35Z</dc:date>
    <item>
      <title>Hive on Spark CDH 5.7 - Failed to create spark client</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-on-Spark-CDH-5-7-Failed-to-create-spark-client/m-p/39831#M25382</link>
<description>&lt;P&gt;I have enabled Spark as the default execution engine for Hive on CDH 5.7, but I get the following error when I execute a query against Hive from my edge node. Is there anything I need to enable on my client edge node? &amp;nbsp;I can run the spark-shell and have exported SPARK_HOME. &amp;nbsp;I also copied the Client Config to the edge node. &amp;nbsp;Is there anything else I need to enable/configure?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT size="2"&gt;ERROR : Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark client.)'&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:64)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:114)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:125)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:97)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1774)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1531)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1311)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1120)&lt;/FONT&gt;&lt;BR 
/&gt;&lt;FONT size="2"&gt;at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1113)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:178)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at org.apache.hive.service.cli.operation.SQLOperation.access$100(SQLOperation.java:72)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at org.apache.hive.service.cli.operation.SQLOperation$2$1.run(SQLOperation.java:232)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at javax.security.auth.Subject.doAs(Subject.java:415)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at org.apache.hive.service.cli.operation.SQLOperation$2.run(SQLOperation.java:245)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at java.util.concurrent.FutureTask.run(FutureTask.java:262)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at java.lang.Thread.run(Thread.java:745)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel client '478049ac-228c-4abb-8ef3-93157822a0a1'. 
Error: Child process exited before connecting back&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at com.google.common.base.Throwables.propagate(Throwables.java:156)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at org.apache.hive.spark.client.SparkClientImpl.&amp;lt;init&amp;gt;(SparkClientImpl.java:111)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:98)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.&amp;lt;init&amp;gt;(RemoteHiveSparkClient.java:94)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:63)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:62)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;... 22 more&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel client '478049ac-228c-4abb-8ef3-93157822a0a1'. Error: Child process exited before connecting back&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at org.apache.hive.spark.client.SparkClientImpl.&amp;lt;init&amp;gt;(SparkClientImpl.java:101)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;... 27 more&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;Caused by: java.lang.RuntimeException: Cancel client '478049ac-228c-4abb-8ef3-93157822a0a1'. 
Error: Child process exited before connecting back&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at org.apache.hive.spark.client.rpc.RpcServer.cancelClient(RpcServer.java:179)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;at org.apache.hive.spark.client.SparkClientImpl$3.run(SparkClientImpl.java:450)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;... 1 more&lt;/FONT&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 16 Sep 2022 10:14:19 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-on-Spark-CDH-5-7-Failed-to-create-spark-client/m-p/39831#M25382</guid>
      <dc:creator>shaileshCG</dc:creator>
      <dc:date>2022-09-16T10:14:19Z</dc:date>
    </item>
    <item>
      <title>Re: Hive on Spark CDH 5.7 - Failed to create spark client</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-on-Spark-CDH-5-7-Failed-to-create-spark-client/m-p/39845#M25383</link>
<description>&lt;P&gt;The YARN container memory was smaller than the Spark executor requirement. &amp;nbsp;I set the YARN container memory and maximum allocation to be greater than the Spark executor memory + overhead. &amp;nbsp;Check&amp;nbsp;'yarn.scheduler.maximum-allocation-mb' and/or 'yarn.nodemanager.resource.memory-mb'.&lt;/P&gt;</description>
      <pubDate>Sun, 17 Apr 2016 19:15:26 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-on-Spark-CDH-5-7-Failed-to-create-spark-client/m-p/39845#M25383</guid>
      <dc:creator>shaileshCG</dc:creator>
      <dc:date>2016-04-17T19:15:26Z</dc:date>
    </item>
    <item>
      <title>Re: Hive on Spark CDH 5.7 - Failed to create spark client</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-on-Spark-CDH-5-7-Failed-to-create-spark-client/m-p/40752#M25384</link>
      <description>What are your values for executors? And how have you figured out it was a memory issue?&lt;BR /&gt;&lt;BR /&gt;Thanks</description>
      <pubDate>Thu, 12 May 2016 05:18:35 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-on-Spark-CDH-5-7-Failed-to-create-spark-client/m-p/40752#M25384</guid>
      <dc:creator>dmitry_kniazev</dc:creator>
      <dc:date>2016-05-12T05:18:35Z</dc:date>
    </item>
    <item>
      <title>Re: Hive on Spark CDH 5.7 - Failed to create spark client</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-on-Spark-CDH-5-7-Failed-to-create-spark-client/m-p/40760#M25385</link>
<description>&lt;P&gt;The YARN logs contained errors complaining about memory deficiencies when I selected the Spark engine for Hive.&amp;nbsp; I also noticed that the default Spark executor memory size + overhead was larger than the YARN container memory settings.&amp;nbsp; Increasing the YARN container memory configuration cured the problem; alternatively, you could lower the Spark executor requirements.&lt;/P&gt;</description>
      <pubDate>Thu, 12 May 2016 08:51:15 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-on-Spark-CDH-5-7-Failed-to-create-spark-client/m-p/40760#M25385</guid>
      <dc:creator>shaileshCG</dc:creator>
      <dc:date>2016-05-12T08:51:15Z</dc:date>
    </item>
    <item>
      <title>Re: Hive on Spark CDH 5.7 - Failed to create spark client</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-on-Spark-CDH-5-7-Failed-to-create-spark-client/m-p/40837#M25386</link>
<description>&lt;P&gt;Still not working for me... I have played with multiple parameters, but no success. Also, the YARN logs do not show anything bad about memory. Any ideas?&lt;/P&gt;</description>
      <pubDate>Sun, 15 May 2016 00:38:33 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-on-Spark-CDH-5-7-Failed-to-create-spark-client/m-p/40837#M25386</guid>
      <dc:creator>dmitry_kniazev</dc:creator>
      <dc:date>2016-05-15T00:38:33Z</dc:date>
    </item>
    <item>
      <title>Re: Hive on Spark CDH 5.7 - Failed to create spark client</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-on-Spark-CDH-5-7-Failed-to-create-spark-client/m-p/40911#M25387</link>
<description>&lt;P&gt;When you say it is not working, what issue does it exhibit? &amp;nbsp;For Hive on Spark you only need to set the execution engine within Hive from MapReduce to Spark. &amp;nbsp;You do need to consider the Spark executor memory settings in the Spark service, and these must correlate to the YARN container memory settings. &amp;nbsp;Generally I set the following YARN container settings:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;yarn.nodemanager.resource.memory-mb&lt;/P&gt;&lt;P&gt;yarn.scheduler.maximum-allocation-mb&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;to the same value, and greater than the Spark executor memory + overhead. &amp;nbsp;Check also for an error similar to the following in the YARN logs:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT size="2"&gt;15/09/17 11:15:09 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (2211 MB per container)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="2"&gt;Exception in thread "main" java.lang.IllegalArgumentException: Required executor memory (2048+384 MB) is above the max threshold (2211 MB) of this cluster!&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT size="2"&gt;Regards&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&lt;FONT size="2"&gt;Shailesh&lt;/FONT&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 17 May 2016 09:42:01 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-on-Spark-CDH-5-7-Failed-to-create-spark-client/m-p/40911#M25387</guid>
      <dc:creator>shaileshCG</dc:creator>
      <dc:date>2016-05-17T09:42:01Z</dc:date>
    </item>
  </channel>
</rss>