New Contributor
Posts: 1
Registered: ‎07-31-2018

spark2-shell on yarn not starting

Dear community, 

 

I have a CDH 5.14 test installation on a pseudo-distributed cluster. Everything is fine except the Spark2 shells (Scala and Python).

 

The shells start up, but after printing a few lines nothing further happens:

 

spark2-shell
WARNING: User-defined SPARK_HOME (/opt/cloudera/parcels/SPARK2-2.2.0.cloudera2-1.cdh5.12.0.p0.232957/lib/spark2) overrides detected (/opt/cloudera/parcels/SPARK2/lib/spark2).
WARNING: Running spark-class from user-defined location.
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
18/07/31 21:27:01 WARN util.Utils: Your hostname, centos.gbdmp resolves to a loopback address: 127.0.0.1; using 192.168.2.106 instead (on interface wlp3s0)
18/07/31 21:27:01 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
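The last two warnings suggest the hostname resolves to a loopback address. This is usually harmless, but if Spark binds to the wrong interface you can pin the address explicitly before starting the shell; a minimal sketch, reusing the 192.168.2.106 address from the log above (adjust to your host):

```shell
# Bind Spark to a specific local address instead of letting it
# auto-detect one (see the WARN lines above)
export SPARK_LOCAL_IP=192.168.2.106
spark2-shell
```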

Since it works fine in local mode, there must be some YARN configuration issue. Which parameter do I have to set or increase?

Thanks for a hint. 

Cheers

New Contributor
Posts: 4
Registered: ‎10-21-2018

Re: spark2-shell on yarn not starting

Hi, have you found a solution to the problem above? Please let me know; I have the same problem.

Thank you
Explorer
Posts: 12
Registered: ‎11-05-2018

Re: spark2-shell on yarn not starting

Hi,

 

   By default, spark-shell runs in the root.default queue on YARN. This queue may not have any resources assigned to it. You can check this in the 'YARN Applications' view in Cloudera Manager. If you have defined other queues, you can pass the --queue parameter to the spark-shell command.
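The suggestion above can be tried from the command line; `root.default` comes from the post, while the queue name `myqueue` is only a placeholder for whichever queue exists in your scheduler configuration:

```shell
# Check whether the default queue has capacity assigned
# (queue names depend on your Fair/Capacity Scheduler setup)
yarn queue -status root.default

# Launch the shell against a specific queue instead of root.default
spark2-shell --master yarn --queue myqueue
```

If the shell still hangs, the 'YARN Applications' view in Cloudera Manager shows whether the application is stuck in the ACCEPTED state waiting for resources.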

 

Regards,

 

Bart
