
spark2-shell on yarn not starting


Dear community, 


I have a CDH 5.14 test installation on a pseudo-distributed cluster. Everything is fine except the Spark2 shells (Scala and Python).


The shells start up, but after printing a few lines nothing more happens:


WARNING: User-defined SPARK_HOME (/opt/cloudera/parcels/SPARK2-2.2.0.cloudera2-1.cdh5.12.0.p0.232957/lib/spark2) overrides detected (/opt/cloudera/parcels/SPARK2/lib/spark2).
WARNING: Running spark-class from user-defined location.
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
18/07/31 21:27:01 WARN util.Utils: Your hostname, centos.gbdmp resolves to a loopback address:; using instead (on interface wlp3s0)
18/07/31 21:27:01 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
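As the second warning suggests, SPARK_LOCAL_IP can be set before launching the shell to bind to a specific interface. A minimal sketch (the address below is a placeholder; use the address of the interface named in the warning, e.g. wlp3s0):

```shell
# Tell Spark which local address to bind to, instead of letting it
# fall back from the loopback resolution of the hostname.
# "192.168.0.10" is a placeholder -- substitute your host's real address.
export SPARK_LOCAL_IP=192.168.0.10
spark2-shell --master yarn
```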

Since it works fine in local mode, there must be some YARN configuration issue. Which parameter do I have to set or increase?

Thanks for a hint. 



Hi, have you found a solution to the above problem? Please let me know; I have the same problem.

Thank you




By default, spark-shell runs in the root.default queue on YARN. That queue may have no resources assigned; you can check this in the 'YARN Applications' view in Cloudera Manager. If you have already defined other queues, you can pass the --queue parameter to the spark-shell command.
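For example, assuming a queue named myqueue exists in your YARN scheduler configuration (the queue name here is hypothetical):

```shell
# Launch the Spark 2 shell on YARN, submitting to a specific queue
# instead of root.default. "myqueue" is a placeholder -- substitute a
# queue that actually has resources assigned in your scheduler config.
spark2-shell --master yarn --queue myqueue

# The same flag works for the Python shell:
pyspark2 --master yarn --queue myqueue
```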




