New Contributor
Posts: 3
Registered: ‎07-18-2017
Accepted Solution

spark-shell stuck

I am trying to run spark-shell from the terminal and it gets stuck. Can somebody help me?

This is what I get. I just started using Spark and am totally new to it.

[root@quickstart cloudera]# spark-shell
WARNING: User-defined SPARK_HOME (/opt/cloudera/parcels/CDH-5.8.0-1.cdh5.8.0.p0.42/lib/spark) overrides detected (/usr/lib/spark).
WARNING: Running spark-class from user-defined location.
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.0
      /_/

Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_67)
Type in expressions to have them evaluated.
Type :help for more information.

 

Cloudera Employee
Posts: 16
Registered: ‎11-16-2015

Re: spark-shell stuck

The point where it is stuck likely indicates that the shell can't reach the ResourceManager (RM) on port 8032 to request a new application.

 

$ spark-shell
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.0
      /_/

Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_67)
Type :help for more information.
17/07/18 19:48:57 INFO spark.SparkContext: Running Spark version 1.6.0
...
17/07/18 19:48:59 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
17/07/18 19:48:59 INFO ui.SparkUI: Started SparkUI at http://10.xx.xx.xx:4040
17/07/18 19:48:59 INFO client.RMProxy: Connecting to ResourceManager at host-10-xx-xx-xx.xx.cloudera.com/10.xx.xx.xx:8032

 

Could you confirm from the Cloudera Manager (CM) UI that your YARN services (RM and NM) are healthy and running? If they look good, I'd suggest temporarily changing the shell's logging level from WARN to INFO (and, if that doesn't help, to DEBUG) to see what the shell is up to.
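Not part of the original thread, but a quick way to sanity-check RM reachability from the quickstart VM is to probe the RPC port directly. The host and port here are placeholders based on the `RMProxy: Connecting to ResourceManager ... :8032` log line above; substitute the host shown in your own logs:

```shell
#!/bin/sh
# Sketch: probe the YARN ResourceManager RPC port using bash's /dev/tcp.
# RM_HOST/RM_PORT are assumptions -- set them to the host:port from your
# "Connecting to ResourceManager" log line (port 8032 is the YARN default).
RM_HOST="${RM_HOST:-localhost}"
RM_PORT="${RM_PORT:-8032}"

if timeout 5 bash -c "exec 3<>/dev/tcp/${RM_HOST}/${RM_PORT}" 2>/dev/null; then
    echo "ResourceManager reachable at ${RM_HOST}:${RM_PORT}"
else
    echo "Cannot reach ${RM_HOST}:${RM_PORT} - is YARN started?"
fi
```

If the probe fails, the shell will hang at exactly the `RMProxy` step shown above, which matches the symptom in this thread.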

 

The easy way to do this is to edit /etc/spark/conf/log4j.properties and add the properties below (or edit them if they already exist):

 

# vi /etc/spark/conf/log4j.properties 
shell.log.level=INFO
log4j.rootCategory=INFO, console

 

$ spark-shell
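As a small convenience (not from the thread itself), the manual edit above can be scripted idempotently, so re-running it doesn't duplicate lines. The config path is the one quoted in the post; adjust it if your Spark client config lives elsewhere:

```shell
#!/bin/sh
# Sketch: add the two logging properties to log4j.properties only if they
# are not already present. CONF defaults to the path used in this thread.
CONF="${CONF:-/etc/spark/conf/log4j.properties}"

grep -q '^shell.log.level=' "$CONF" 2>/dev/null \
    || echo 'shell.log.level=INFO' >> "$CONF"
grep -q '^log4j.rootCategory=' "$CONF" 2>/dev/null \
    || echo 'log4j.rootCategory=INFO, console' >> "$CONF"
```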

New Contributor
Posts: 3
Registered: ‎07-18-2017

Re: spark-shell stuck

Here is the Screenshot of Yarn:

screenshot-quickstart.cloudera 7180-2017-07-19-14-31-51.png

And I made the changes in the log4j.properties file, but it's still the same.

 

Cloudera Employee
Posts: 16
Registered: ‎11-16-2015

Re: spark-shell stuck

Thank you! The screenshot shows that your YARN service (which includes the ResourceManager and NodeManager) is not up/started. Please start it: on the far right-hand side you will see Actions (drop-down) > Start. Once the services are started you'll see a green icon next to YARN (MR2 Included), and the Status Summary will show all services as 'Started' (right now it shows them as stopped).

 

Let us know how it goes. If the services are not starting or have trouble starting up, please share the error message.

 

Good Luck!

New Contributor
Posts: 3
Registered: ‎07-18-2017

Re: spark-shell stuck

Thank you so much for the response.

All this time I was not using Cloudera Express, and I am just getting used to it.

Now spark-shell starts with no problem.

I really appreciate your help.

 

Thank you

Pavan
