
spark-shell --master yarn error in HDP 2.6.5

Expert Contributor

In HDP 2.6.5 I tried:

[root@sandbox-hdp ~]# spark-shell --master yarn
But I got the following errors. The Scala shell opens, but a Spark session cannot be created:
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
19/10/19 06:04:17 ERROR YarnClientSchedulerBackend: Yarn application has already exited with state FAILED!
19/10/19 06:04:17 ERROR TransportClient: Failed to send RPC 5080948039683175202 to /172.18.0.2:52542: java.nio.channels.ClosedChannelException
java.nio.channels.ClosedChannelException
        at io.netty.channel.AbstractChannel$AbstractUnsafe.write(...)(Unknown Source)
19/10/19 06:04:17 ERROR YarnSchedulerBackend$YarnSchedulerEndpoint: Sending RequestExecutors(0,0,Map(),Set()) to AM was unsuccessful
java.io.IOException: Failed to send RPC 5080948039683175202 to /172.18.0.2:52542: java.nio.channels.ClosedChannelException
....
....
Caused by: java.nio.channels.ClosedChannelException
...
19/10/19 06:04:17 ERROR Utils: Uncaught exception in thread Yarn application state monitor
org.apache.spark.SparkException: Exception thrown in awaitResult:
...
Caused by: java.io.IOException: Failed to send RPC 5080948039683175202 to /172.18.0.2:52542: java.nio.channels.ClosedChannelException
...
19/10/19 06:04:17 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalStateException: Spark context stopped while waiting for backend
...
<console>:14: error: not found: value spark
       import spark.implicits._
              ^
<console>:14: error: not found: value spark
       import spark.sql
Then I looked at yarn logs:
19/10/19 06:21:35 INFO RMProxy: Connecting to ResourceManager at sandbox-hdp.hortonworks.com/172.18.0.2:8030
19/10/19 06:21:35 INFO YarnRMClient: Registering the ApplicationMaster
19/10/19 06:21:35 INFO YarnAllocator: Will request 2 executor container(s), each with 1 core(s) and 1408 MB memory (including 384 MB of overhead)
....
19/10/19 06:21:35 INFO RMProxy: Connecting to ResourceManager at sandbox-hdp.hortonworks.com/172.18.0.2:8030
19/10/19 06:21:35 INFO YarnRMClient: Registering the ApplicationMaster
19/10/19 06:21:35 INFO YarnAllocator: Will request 2 executor container(s), each with 1 core(s) and 1408 MB memory (including 384 MB of overhead)
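As an aside on the figures in that log: the 1408 MB container size is Spark's default 1 GB executor memory plus the default per-container overhead of max(384 MB, 10% of executor memory). A quick sketch of the arithmetic:

```shell
# Sketch of how Spark arrives at the 1408 MB container size in the log above:
# default executor memory (1 GB) plus overhead of max(384 MB, 10% of executor memory).
executor_memory_mb=1024
overhead_mb=$(( executor_memory_mb / 10 ))            # 10% of executor memory = 102 MB
[ "$overhead_mb" -lt 384 ] && overhead_mb=384         # floor of 384 MB applies
container_mb=$(( executor_memory_mb + overhead_mb ))
echo "$container_mb"                                  # prints 1408
```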
7 REPLIES

Expert Contributor

I deleted the Sandbox and imported it fresh. It worked.

Expert Contributor

Unfortunately, after a while, the same problem occurred again. 

Master Mentor

@erkansirin78 
Can you share the steps you executed? Have a look at this spark-shell

Expert Contributor

Hi @Shelton 

No specific steps, actually. I just start the HDP 2.6.5 Sandbox, connect to it via SSH, and run spark-shell --master yarn.

I also tried starting Spark from Zeppelin and examined the logs; the error was the same.

spark-shell opens in local mode without a problem, but I can't start it in yarn mode. On a freshly imported Sandbox, though, it does start.
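For what it's worth, this failure pattern on the Sandbox (the AM exits with state FAILED, then the driver hits ClosedChannelException) is often a symptom of YARN killing the ApplicationMaster container for exceeding its memory limits. A sketch of things to try, assuming that cause (the 1g value is a guess, not a Sandbox default):

```shell
# Assumption: YARN is killing the application master for exceeding memory limits.
# Give the yarn-client-mode AM more memory explicitly:
spark-shell --master yarn --conf spark.yarn.am.memory=1g

# If the YARN container logs mention "running beyond virtual memory limits",
# a common Sandbox workaround is setting yarn.nodemanager.vmem-check-enabled=false
# in yarn-site.xml (via Ambari) and restarting YARN.
```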

Master Mentor

@erkansirin78 

An extract from spark.apache.org: "In the Spark shell, a special interpreter-aware SparkContext is already created for you, in the variable called sc. Making your own SparkContext will not work. You can set which master the context connects to using the --master argument, and you can add JARs to the classpath by passing a comma-separated list to the --jars argument."
I am not a Spark expert, but I'm trying to understand.
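For reference, the two options mentioned in that extract would look like this on the Sandbox command line (the jar paths are placeholders, not files that ship with HDP):

```shell
# Point the shell's pre-created SparkContext (sc) at YARN and add extra jars
# to the classpath. The jar paths below are hypothetical.
spark-shell --master yarn --jars /tmp/libA.jar,/tmp/libB.jar
```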

Expert Contributor

Here is spark-shell --master yarn on a newly imported HDP 2.6.5 Sandbox:

[root@sandbox-hdp ~]# spark-shell --master yarn
SPARK_MAJOR_VERSION is set to 2, using Spark2
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://sandbox-hdp.hortonworks.com:4040
Spark context available as 'sc' (master = yarn, app id = application_1572283124735_0001).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.3.0.2.6.5.0-292
      /_/

Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_171)
Type in expressions to have them evaluated.
Type :help for more information.

scala>
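One small trick for the next time it fails: the app id on the "Spark context available" line can be fed to yarn logs to see why the containers exited. A sketch using the id from the output above:

```shell
# Parse the YARN application id out of the line spark-shell prints,
# then (on the cluster) fetch the aggregated container logs for it.
line="Spark context available as 'sc' (master = yarn, app id = application_1572283124735_0001)."
app_id=$(printf '%s\n' "$line" | sed -n 's/.*app id = \(application_[0-9_]*\)).*/\1/p')
echo "$app_id"    # application_1572283124735_0001
# yarn logs -applicationId "$app_id"    # run on the Sandbox to inspect container failures
```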

Master Mentor

@erkansirin78 
That's exactly the output I was getting on my single-node cluster (not the Sandbox), but I didn't know exactly what you wanted.

When you start getting errors again, feel free to ping me!