<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Error while initiating spark shell in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Error-while-initiating-spark-shell/m-p/300674#M220348</link>
    <description>&lt;P&gt;I used to work at Cloudera/Hortonworks, and now I am a Hashmap Inc. consultant. This solution worked perfectly, thank you.&lt;/P&gt;</description>
    <pubDate>Fri, 31 Jul 2020 15:14:11 GMT</pubDate>
    <dc:creator>BigDataBear</dc:creator>
    <dc:date>2020-07-31T15:14:11Z</dc:date>
    <item>
      <title>Error while initiating spark shell</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Error-while-initiating-spark-shell/m-p/293604#M216775</link>
      <description>&lt;P&gt;Hi friends,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I have Cloudera trial version 6.2. When I try to launch the Spark shell from the command line using&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;spark-shell, I get the error below:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;[root@cloudera tmp]# spark-shell&lt;BR /&gt;Setting default log level to "WARN".&lt;BR /&gt;To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).&lt;BR /&gt;20/04/09 08:19:33 ERROR spark.SparkContext: Error initializing SparkContext.&lt;BR /&gt;java.lang.IllegalArgumentException: Required executor memory (1024), overhead (384 MB), and PySpark memory (0 MB) is above the max threshold (1024 MB) of this cluster! Please check the values of 'yarn.scheduler.maximum-allocation-mb' and/or 'yarn.nodemanager.resource.memory-mb'.&lt;BR /&gt;at org.apache.spark.deploy.yarn.Client.verifyClusterResources(Client.scala:345)&lt;BR /&gt;at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:179)&lt;BR /&gt;at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:60)&lt;BR /&gt;at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:184)&lt;BR /&gt;at org.apache.spark.SparkContext.&amp;lt;init&amp;gt;(SparkContext.scala:511)&lt;BR /&gt;at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2549)&lt;BR /&gt;at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:944)&lt;BR /&gt;at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)&lt;BR /&gt;at scala.Option.getOrElse(Option.scala:121)&lt;BR /&gt;at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:935)&lt;BR /&gt;at org.apache.spark.repl.Main$.createSparkSession(Main.scala:106)&lt;BR /&gt;at $line3.$read$$iw$$iw.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:15)&lt;BR /&gt;at $line3.$read$$iw.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:43)&lt;BR /&gt;at $line3.$read.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:45)&lt;BR /&gt;at $line3.$read$.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:49)&lt;BR /&gt;at $line3.$read$.&amp;lt;clinit&amp;gt;(&amp;lt;console&amp;gt;)&lt;BR /&gt;at 
$line3.$eval$.$print$lzycompute(&amp;lt;console&amp;gt;:7)&lt;BR /&gt;at $line3.$eval$.$print(&amp;lt;console&amp;gt;:6)&lt;BR /&gt;at $line3.$eval.$print(&amp;lt;console&amp;gt;)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:498)&lt;BR /&gt;at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:793)&lt;BR /&gt;at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1054)&lt;BR /&gt;at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:645)&lt;BR /&gt;at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:644)&lt;BR /&gt;at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)&lt;BR /&gt;at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)&lt;BR /&gt;at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:644)&lt;BR /&gt;at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:576)&lt;BR /&gt;at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:572)&lt;BR /&gt;at scala.tools.nsc.interpreter.IMain$$anonfun$quietRun$1.apply(IMain.scala:231)&lt;BR /&gt;at scala.tools.nsc.interpreter.IMain$$anonfun$quietRun$1.apply(IMain.scala:231)&lt;BR /&gt;at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:221)&lt;BR /&gt;at scala.tools.nsc.interpreter.IMain.quietRun(IMain.scala:231)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:109)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:109)&lt;BR /&gt;at 
scala.collection.immutable.List.foreach(List.scala:392)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:109)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:109)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:109)&lt;BR /&gt;at scala.tools.nsc.interpreter.ILoop.savingReplayStack(ILoop.scala:91)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:108)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1$1.apply$mcV$sp(SparkILoop.scala:211)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1$1.apply(SparkILoop.scala:199)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1$1.apply(SparkILoop.scala:199)&lt;BR /&gt;at scala.tools.nsc.interpreter.ILoop$$anonfun$mumly$1.apply(ILoop.scala:189)&lt;BR /&gt;at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:221)&lt;BR /&gt;at scala.tools.nsc.interpreter.ILoop.mumly(ILoop.scala:186)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$process$1.org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1(SparkILoop.scala:199)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1.apply(SparkILoop.scala:267)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1.apply(SparkILoop.scala:247)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$process$1.withSuppressedSettings$1(SparkILoop.scala:235)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$process$1.startup$1(SparkILoop.scala:247)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:282)&lt;BR /&gt;at 
org.apache.spark.repl.SparkILoop.runClosure(SparkILoop.scala:159)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:182)&lt;BR /&gt;at org.apache.spark.repl.Main$.doMain(Main.scala:78)&lt;BR /&gt;at org.apache.spark.repl.Main$.main(Main.scala:58)&lt;BR /&gt;at org.apache.spark.repl.Main.main(Main.scala)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:498)&lt;BR /&gt;at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:851)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:926)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:935)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)&lt;BR /&gt;20/04/09 08:19:33 WARN cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Attempted to request executors before the AM has registered!&lt;BR /&gt;20/04/09 08:19:33 WARN metrics.MetricsSystem: Stopping a MetricsSystem that is not running&lt;BR /&gt;20/04/09 08:19:33 ERROR repl.Main: Failed to initialize Spark session.&lt;BR /&gt;java.lang.IllegalArgumentException: Required executor memory (1024), overhead (384 MB), and PySpark memory (0 MB) is above the max threshold (1024 MB) of this cluster! 
Please check the values of 'yarn.scheduler.maximum-allocation-mb' and/or 'yarn.nodemanager.resource.memory-mb'.&lt;BR /&gt;at org.apache.spark.deploy.yarn.Client.verifyClusterResources(Client.scala:345)&lt;BR /&gt;at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:179)&lt;BR /&gt;at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:60)&lt;BR /&gt;at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:184)&lt;BR /&gt;at org.apache.spark.SparkContext.&amp;lt;init&amp;gt;(SparkContext.scala:511)&lt;BR /&gt;at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2549)&lt;BR /&gt;at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:944)&lt;BR /&gt;at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)&lt;BR /&gt;at scala.Option.getOrElse(Option.scala:121)&lt;BR /&gt;at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:935)&lt;BR /&gt;at org.apache.spark.repl.Main$.createSparkSession(Main.scala:106)&lt;BR /&gt;at $line3.$read$$iw$$iw.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:15)&lt;BR /&gt;at $line3.$read$$iw.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:43)&lt;BR /&gt;at $line3.$read.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:45)&lt;BR /&gt;at $line3.$read$.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:49)&lt;BR /&gt;at $line3.$read$.&amp;lt;clinit&amp;gt;(&amp;lt;console&amp;gt;)&lt;BR /&gt;at $line3.$eval$.$print$lzycompute(&amp;lt;console&amp;gt;:7)&lt;BR /&gt;at $line3.$eval$.$print(&amp;lt;console&amp;gt;:6)&lt;BR /&gt;at $line3.$eval.$print(&amp;lt;console&amp;gt;)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at 
java.lang.reflect.Method.invoke(Method.java:498)&lt;BR /&gt;at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:793)&lt;BR /&gt;at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1054)&lt;BR /&gt;at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:645)&lt;BR /&gt;at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:644)&lt;BR /&gt;at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)&lt;BR /&gt;at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)&lt;BR /&gt;at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:644)&lt;BR /&gt;at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:576)&lt;BR /&gt;at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:572)&lt;BR /&gt;at scala.tools.nsc.interpreter.IMain$$anonfun$quietRun$1.apply(IMain.scala:231)&lt;BR /&gt;at scala.tools.nsc.interpreter.IMain$$anonfun$quietRun$1.apply(IMain.scala:231)&lt;BR /&gt;at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:221)&lt;BR /&gt;at scala.tools.nsc.interpreter.IMain.quietRun(IMain.scala:231)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:109)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:109)&lt;BR /&gt;at scala.collection.immutable.List.foreach(List.scala:392)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:109)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:109)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:109)&lt;BR /&gt;at scala.tools.nsc.interpreter.ILoop.savingReplayStack(ILoop.scala:91)&lt;BR /&gt;at 
org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:108)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1$1.apply$mcV$sp(SparkILoop.scala:211)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1$1.apply(SparkILoop.scala:199)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1$1.apply(SparkILoop.scala:199)&lt;BR /&gt;at scala.tools.nsc.interpreter.ILoop$$anonfun$mumly$1.apply(ILoop.scala:189)&lt;BR /&gt;at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:221)&lt;BR /&gt;at scala.tools.nsc.interpreter.ILoop.mumly(ILoop.scala:186)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$process$1.org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1(SparkILoop.scala:199)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1.apply(SparkILoop.scala:267)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1.apply(SparkILoop.scala:247)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$process$1.withSuppressedSettings$1(SparkILoop.scala:235)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$process$1.startup$1(SparkILoop.scala:247)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:282)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.runClosure(SparkILoop.scala:159)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:182)&lt;BR /&gt;at org.apache.spark.repl.Main$.doMain(Main.scala:78)&lt;BR /&gt;at org.apache.spark.repl.Main$.main(Main.scala:58)&lt;BR /&gt;at org.apache.spark.repl.Main.main(Main.scala)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:498)&lt;BR /&gt;at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:851)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:926)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:935)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I am not sure of the reason behind the above error. Kindly help me out.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Regards,&lt;/P&gt;
&lt;P&gt;GTA&lt;/P&gt;</description>
      <pubDate>Thu, 09 Apr 2020 13:06:36 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Error-while-initiating-spark-shell/m-p/293604#M216775</guid>
      <dc:creator>GTA</dc:creator>
      <dc:date>2020-04-09T13:06:36Z</dc:date>
    </item>
    <item>
      <title>Re: Error while initiating spark shell</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Error-while-initiating-spark-shell/m-p/293957#M216960</link>
      <description>&lt;P&gt;&lt;FONT face="times new roman,times,serif"&gt;Hey&amp;nbsp;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/75045"&gt;@GTA&lt;/a&gt;,&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT face="times new roman,times,serif"&gt;Thanks for reaching out to the Cloudera community.&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;FONT face="times new roman,times,serif"&gt;"Required executor memory (1024), overhead (384 MB), and PySpark memory (0 MB) is above the max threshold (1024 MB) of this cluster!"&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT face="times new roman,times,serif"&gt;&amp;gt;&amp;gt; This issue occurs when the total memory required to run a Spark executor in a container (the executor memory, spark.executor.memory, plus the executor memory overhead, spark.yarn.executor.memoryOverhead) exceeds the memory available for running containers on the NodeManager node (yarn.nodemanager.resource.memory-mb).&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT face="times new roman,times,serif"&gt;Based on the above exception, you have the default 1 GB (1024 MB) configured for a Spark executor and the default overhead of 384 MB, so the total memory required to run the container is 1024 MB + 384 MB = 1408 MB.&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT face="times new roman,times,serif"&gt;Since the NodeManager was configured with too little memory to run even a single container (only 1024 MB), this exception is expected.&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT face="times new roman,times,serif"&gt;Increasing the NodeManager memory setting to 2048 MB will allow a single container to run on the NodeManager node. 
Use the following steps to increase the "yarn.nodemanager.resource.memory-mb" parameter and resolve this.&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT face="times new roman,times,serif"&gt;Cloudera Manager &amp;gt;&amp;gt; YARN &amp;gt;&amp;gt; Configuration &amp;gt;&amp;gt; Search "yarn.nodemanager.resource.memory-mb" &amp;gt;&amp;gt; Configure 2048 MB or higher &amp;gt;&amp;gt; Save &amp;amp; Restart.&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT face="times new roman,times,serif"&gt;Let me know if this helps.&lt;/FONT&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 14 Apr 2020 11:45:00 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Error-while-initiating-spark-shell/m-p/293957#M216960</guid>
      <dc:creator>TonyStank</dc:creator>
      <dc:date>2020-04-14T11:45:00Z</dc:date>
    </item>
    <item>
      <title>Re: Error while initiating spark shell</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Error-while-initiating-spark-shell/m-p/294097#M217052</link>
      <description>&lt;P&gt;Thanks a lot for your reply and for your solution, Tony. :-)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Regards,&lt;/P&gt;&lt;P&gt;GTA&lt;/P&gt;</description>
      <pubDate>Thu, 16 Apr 2020 08:02:46 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Error-while-initiating-spark-shell/m-p/294097#M217052</guid>
      <dc:creator>GTA</dc:creator>
      <dc:date>2020-04-16T08:02:46Z</dc:date>
    </item>
    <item>
      <title>Re: Error while initiating spark shell</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Error-while-initiating-spark-shell/m-p/300674#M220348</link>
      <description>&lt;P&gt;I used to work at Cloudera/Hortonworks, and now I am a Hashmap Inc. consultant. This solution worked perfectly, thank you.&lt;/P&gt;</description>
      <pubDate>Fri, 31 Jul 2020 15:14:11 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Error-while-initiating-spark-shell/m-p/300674#M220348</guid>
      <dc:creator>BigDataBear</dc:creator>
      <dc:date>2020-07-31T15:14:11Z</dc:date>
    </item>
  </channel>
</rss>