
spark2-shell on yarn not starting

Dear community, 

 

I have a CDH 5.14 test installation on a pseudo-distributed cluster. Everything is fine except the Spark2 shells (Scala and Python).

 

The shells start up, but after printing a few lines nothing more happens:

 

spark2-shell
WARNING: User-defined SPARK_HOME (/opt/cloudera/parcels/SPARK2-2.2.0.cloudera2-1.cdh5.12.0.p0.232957/lib/spark2) overrides detected (/opt/cloudera/parcels/SPARK2/lib/spark2).
WARNING: Running spark-class from user-defined location.
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
18/07/31 21:27:01 WARN util.Utils: Your hostname, centos.gbdmp resolves to a loopback address: 127.0.0.1; using 192.168.2.106 instead (on interface wlp3s0)
18/07/31 21:27:01 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
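
While the shell is hanging I can check from a second terminal whether the YARN application ever leaves the ACCEPTED state. This is just a diagnostic sketch, I have not captured its output yet:

# Run from a second terminal while spark2-shell appears to hang.
# If the application stays in ACCEPTED, YARN is not allocating the
# ApplicationMaster container, which would explain the silent wait.
yarn application -list -appStates ACCEPTED,RUNNING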

Since it works fine in local mode, I suspect a YARN configuration issue. Which parameter do I have to set or increase?
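
My guess is that the pseudo-distributed defaults leave too little room for the ApplicationMaster plus one executor. As a minimal test (the resource values below are illustrative, not tuned for this box), I would retry with the smallest footprint:

# Request the smallest practical containers; if this comes up while the
# defaults hang, the YARN per-container limits are the likely culprit.
spark2-shell --master yarn \
  --driver-memory 512m \
  --executor-memory 1g \
  --num-executors 1 \
  --executor-cores 1

If I understand the scheduling correctly, the YARN settings to compare against these requests would be yarn.nodemanager.resource.memory-mb, yarn.scheduler.maximum-allocation-mb and yarn.nodemanager.resource.cpu-vcores.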

Thanks for a hint. 

Cheers
