Explorer
Posts: 8
Registered: 06-29-2017

Reinstalling Spark


Hi,

I had problems with Spark and my solution was to reinstall the service, but it turned out to be a bad idea. These are the steps it went through:

 

Ensuring that the expected software releases are installed on hosts. > OK

Successfully deployed all client configurations. > OK

Finished waiting > OK

Start Zookeeper > OK

Start HDFS > OK

Start YARN > ERROR:

 

 

com.cloudera.cmf.service.config.ConfigGenException: Conflicting yarn extensions provided by more than one service. Yarn extension key [spark_shuffle], First value [YarnAuxServiceExtension{className='org.apache.spark.network.yarn.YarnShuffleService', auxServiceId='spark_shuffle', configs={spark.shuffle.service.port=7337, spark.authenticate=true}, service=Spark}], Second value [YarnAuxServiceExtension{className='org.apache.spark.network.yarn.YarnShuffleService', auxServiceId='spark_shuffle', configs={spark.shuffle.service.port=7337, spark.authenticate=false}, service=Spark}].
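
If I read the error correctly, two services are both registering the spark_shuffle auxiliary service for YARN and they disagree on spark.authenticate (one true, one false), which suggests the old Spark service may not have been fully removed before the new one was added. As a rough sketch only (illustrative values, not taken from this cluster), the application-side settings that have to agree with the NodeManager's shuffle service would look something like this in Spark 1.6:

import org.apache.spark.SparkConf

// Illustrative values only: they must match what the spark_shuffle
// auxiliary service on the NodeManagers was configured with.
val conf = new SparkConf()
  .set("spark.shuffle.service.enabled", "true") // use the external shuffle service
  .set("spark.shuffle.service.port", "7337")    // same port as in the error above
  .set("spark.authenticate", "true")            // the two conflicting extensions differ on this flag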

 

However, Cloudera Manager shows Spark as OK, but spark-shell stops when it tries to initialize the Spark context:

 

Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.0
      /_/

Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_67)
Type in expressions to have them evaluated.
Type :help for more information.
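
For reference, when the shell does come up cleanly the SparkContext is already bound to sc, so a minimal sanity check (just an example, not output from this cluster) would be:

scala> sc.master                        // shows the master the shell connected to, e.g. yarn-client
scala> sc.parallelize(1 to 100).count() // should return 100 once executors are up

In my case the shell hangs while initializing the context, so I never get that far.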

Champion
Posts: 600
Registered: 05-16-2016

Re: Reinstalling Spark

Did you remove the Spark service cleanly, without any issues, before re-installing?

Could you provide us with the logs from the Spark service?
