spark-shell --master yarn error in HDP 2.6.5
Created on 10-19-2019 01:14 AM - last edited on 10-19-2019 05:14 AM by cjervis
In HDP 2.6.5 I tried:
[root@sandbox-hdp ~]# spark-shell --master yarn
But I got the following errors. It opens the Scala shell but cannot create a Spark session.
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
19/10/19 06:04:17 ERROR YarnClientSchedulerBackend: Yarn application has already exited with state FAILED!
19/10/19 06:04:17 ERROR TransportClient: Failed to send RPC 5080948039683175202 to /172.18.0.2:52542: java.nio.channels.ClosedChannelException
java.nio.channels.ClosedChannelException
at io.netty.channel.AbstractChannel$AbstractUnsafe.write(...)(Unknown Source)
19/10/19 06:04:17 ERROR YarnSchedulerBackend$YarnSchedulerEndpoint: Sending RequestExecutors(0,0,Map(),Set()) to AM was unsuccessful
java.io.IOException: Failed to send RPC 5080948039683175202 to /172.18.0.2:52542: java.nio.channels.ClosedChannelException
....
....
Caused by: java.nio.channels.ClosedChannelException
...
19/10/19 06:04:17 ERROR Utils: Uncaught exception in thread Yarn application state monitor
org.apache.spark.SparkException: Exception thrown in awaitResult:
...
Caused by: java.io.IOException: Failed to send RPC 5080948039683175202 to /172.18.0.2:52542: java.nio.channels.ClosedChannelException
...
19/10/19 06:04:17 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalStateException: Spark context stopped while waiting for backend
...
<console>:14: error: not found: value spark
import spark.implicits._
^
<console>:14: error: not found: value spark
import spark.sql
Then I looked at the YARN logs:
19/10/19 06:21:35 INFO RMProxy: Connecting to ResourceManager at sandbox-hdp.hortonworks.com/172.18.0.2:8030
19/10/19 06:21:35 INFO YarnRMClient: Registering the ApplicationMaster
19/10/19 06:21:35 INFO YarnAllocator: Will request 2 executor container(s), each with 1 core(s) and 1408 MB memory (including 384 MB of overhead)
....
19/10/19 06:21:35 INFO RMProxy: Connecting to ResourceManager at sandbox-hdp.hortonworks.com/172.18.0.2:8030
19/10/19 06:21:35 INFO YarnRMClient: Registering the ApplicationMaster
19/10/19 06:21:35 INFO YarnAllocator: Will request 2 executor container(s), each with 1 core(s) and 1408 MB memory (including 384 MB of overhead)
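For anyone hitting the same failure: the full YARN application log usually shows the real reason the ApplicationMaster exited, and can be pulled with the standard YARN CLI (the application ID is a placeholder; use the one spark-shell prints):

[root@sandbox-hdp ~]# yarn logs -applicationId <application_id>

On the memory-constrained Sandbox this pattern is often the AM or executor container being killed for memory, so explicitly requesting smaller containers may be worth a try (an educated guess, not a confirmed fix for this thread):

[root@sandbox-hdp ~]# spark-shell --master yarn --driver-memory 512m --executor-memory 512m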
Created 10-19-2019 02:45 AM
I deleted the Sandbox and imported a fresh copy. It worked.
Created 10-26-2019 10:42 PM
Unfortunately, after a while, the same problem occurred again.
Created on 10-27-2019 03:11 AM - edited 10-27-2019 03:13 AM
Created on 10-27-2019 05:21 AM - edited 10-27-2019 05:23 AM
Hi @Shelton
No specific steps, actually. I just open the HDP 2.6.5 Sandbox, connect to it via SSH, and run spark-shell --master yarn (the exact commands are sketched below).
Alternatively, I tried to start Spark from Zeppelin and examined the logs; the ERROR was the same.
spark-shell opens in local mode with no problem, but I can't start it in yarn mode. I did manage to start it on a newly imported Sandbox.
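For completeness, the whole session is just the standard Sandbox login followed by the shell launch (port 2222 is the usual Docker Sandbox SSH mapping; yours may differ):

ssh -p 2222 root@localhost
[root@sandbox-hdp ~]# spark-shell --master yarn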
Created 10-27-2019 06:34 AM
Extract from spark.apache.org: In the Spark shell, a special interpreter-aware SparkContext is already created for you, in the variable called sc. Making your own SparkContext will not work. You can set which master the context connects to using the --master argument, and you can add JARs to the classpath by passing a comma-separated list to the --jars argument.
I am not a Spark expert, but I'm trying to understand.
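To make the quoted extract concrete, here is a minimal sketch of how those pre-created handles are meant to be used once the shell does start (generic Spark shell usage, not specific to this thread):

scala> sc.master                        // the master URL the shell connected to, e.g. "yarn"
scala> sc.parallelize(1 to 100).sum()   // runs a small job through the existing SparkContext
scala> spark.range(100).count()         // uses the pre-created SparkSession

The "not found: value spark" errors earlier in the thread mean exactly that this SparkSession was never created, because SparkContext initialization failed while the YARN application was starting up.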
Created 10-28-2019 06:45 PM
Here is spark-shell --master yarn on a newly imported HDP 2.6.5 Sandbox:
[root@sandbox-hdp ~]# spark-shell --master yarn
SPARK_MAJOR_VERSION is set to 2, using Spark2
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://sandbox-hdp.hortonworks.com:4040
Spark context available as 'sc' (master = yarn, app id = application_1572283124735_0001).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.3.0.2.6.5.0-292
      /_/
Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_171)
Type in expressions to have them evaluated.
Type :help for more information.
scala>
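From this point, a one-line smoke test confirms that executors actually run on YARN (a generic check, not from the original post):

scala> sc.parallelize(1 to 1000).count()
res0: Long = 1000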
Created 10-28-2019 10:04 PM
That's exactly the output I was getting on my single-node cluster (not the Sandbox), but I didn't know exactly what you wanted.
When you start getting errors again, you can ping me!
