Explorer
Posts: 77
Registered: ‎11-12-2015
Accepted Solution

Problem removing Spark 1.6 from my cluster

Hello,

 

I want to remove Spark 1.6 in order to install Spark 2.1. But when I try to remove it with this command:

sudo yum remove spark-core spark-master spark-worker spark-history-server spark-python

source: https://www.cloudera.com/documentation/enterprise/5-8-x/topics/cdh_ig_cdh_comp_uninstall.html

 

The packages are not found.

 

What should I do in order to remove Spark 1.6 from my cluster?

 

Also, in a previous step I deleted it from my services.

 

I'm using CDH 5.8.0.
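(For readers hitting the same "packages not found" error: whether `yum remove` can find anything depends on whether Spark was installed from RPM packages or from a Cloudera parcel. A minimal sketch to check which case applies; the `spark_install_type` helper is illustrative, and the default parcel root is an assumption you may need to adjust:)

```shell
# Sketch: report how Spark was installed on this host.
# Assumes the default Cloudera parcel root /opt/cloudera/parcels.
spark_install_type() {
  parcel_root="${1:-/opt/cloudera/parcels}"
  # Package-based install: spark RPMs appear in the rpm database
  if rpm -qa 'spark*' 2>/dev/null | grep -q .; then
    echo package
  # Parcel-based install: a SPARK/CDH directory under the parcel root
  elif ls "$parcel_root" 2>/dev/null | grep -qi spark; then
    echo parcel
  else
    echo none
  fi
}
```

If this prints `parcel`, the yum package names from the uninstall guide will not exist, because parcel-managed components are not tracked by yum.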

 

Explorer
Posts: 77
Registered: ‎11-12-2015

Re: Problem removing Spark 1.6 from my cluster

I forgot to mention that Spark 1.6 came with CDH 5.8. I don't know how CDH installed Spark.
Posts: 291
Topics: 11
Kudos: 43
Solutions: 25
Registered: ‎09-02-2016

Re: Problem removing Spark 1.6 from my cluster

@JoaquinS

 

To my knowledge, you don't need to remove Spark 1.x in order to install Spark 2.x.

 

You can have both and CM will have separate services for spark (spark 1.x) and spark2 (spark 2.x).

 

Note: This side-by-side setup is specific to Spark.
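(Editor's aside: the two services coexist because the SPARK2 parcel ships its own launchers, e.g. `spark2-shell` and `spark2-submit`, next to the Spark 1.6 ones. A tiny illustrative helper, the `spark_bin` function is hypothetical, mapping a major version to its launcher name:)

```shell
# Map a Spark major version to the launcher CDH exposes for it:
# Spark 1.x -> spark-shell, Spark 2.x (SPARK2 parcel) -> spark2-shell
spark_bin() {
  case "$1" in
    1) echo spark-shell ;;
    2) echo spark2-shell ;;
    *) echo "unsupported major version: $1" >&2; return 1 ;;
  esac
}
```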

Posts: 291
Topics: 11
Kudos: 43
Solutions: 25
Registered: ‎09-02-2016

Re: Problem removing Spark 1.6 from my cluster

@JoaquinS

 

Got the link now and it says "A Spark 1.6 service can co-exist on the same cluster as Spark 2."

 

https://www.cloudera.com/documentation/spark2/latest/topics/spark2_requirements.html

 

 

 

Explorer
Posts: 77
Registered: ‎11-12-2015

Re: Problem removing Spark 1.6 from my cluster

@saranvisa Thanks, that worked.

Also, in order to point to the new Spark, I had to change some symbolic links and environment variables.

 

# Let Spark pick up the cluster's Hadoop jars
export SPARK_DIST_CLASSPATH=$(hadoop classpath)
# Point the spark-shell alternative at the Spark 2 binary from the SPARK2 parcel
ln -sf /opt/cloudera/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/lib/spark2/bin/spark-shell /etc/alternatives/spark-shell
# Make Spark 2 the default Spark for this session
export SPARK_HOME=/opt/cloudera/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/lib/spark2