
Problem removing Spark 1.6 from my cluster

Solved

Problem removing Spark 1.6 from my cluster

Contributor

Hello,

 

I want to remove Spark 1.6 in order to install Spark 2.1, but when I try to remove it with this command:

sudo yum remove spark-core spark-master spark-worker spark-history-server spark-python

source: https://www.cloudera.com/documentation/enterprise/5-8-x/topics/cdh_ig_cdh_comp_uninstall.html

 

The packages are not found.

 

What should I do in order to remove Spark 1.6 from my cluster?

 

Also, in a previous step I deleted it from my services.

 

I'm using CDH 5.8.0.
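
A likely reason yum finds nothing: CDH often ships Spark as a Cloudera parcel rather than RPM packages, and parcels are not managed by yum. A quick check (a sketch assuming the default CDH parcel path; `PARCELS_DIR` is overridable just for testing):

```shell
# Quick check: was Spark installed from a Cloudera parcel rather than RPM
# packages? If so, yum has nothing to remove and will report "no match".
PARCELS_DIR=${PARCELS_DIR:-/opt/cloudera/parcels}
if ls "$PARCELS_DIR"/CDH*/lib/spark >/dev/null 2>&1; then
  INSTALL_TYPE=parcel    # managed by Cloudera Manager, not by yum
else
  INSTALL_TYPE=package   # or Spark is simply not installed here
fi
echo "Spark install type: $INSTALL_TYPE"
```

If this prints `parcel`, the uninstall has to happen through Cloudera Manager rather than yum.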

 


Re: Problem removing Spark 1.6 from my cluster

Contributor
I forgot to mention that Spark 1.6 came with CDH 5.8. I don't know how CDH installed Spark.

Re: Problem removing Spark 1.6 from my cluster

Champion

@JoaquinS

 

To my knowledge, you don't need to remove Spark 1.x in order to install Spark 2.x.

 

You can have both, and CM will have separate services for spark (Spark 1.x) and spark2 (Spark 2.x).

 

Note: this applies only to Spark.

Re: Problem removing Spark 1.6 from my cluster

Champion

@JoaquinS

 

Got the link now and it says "A Spark 1.6 service can co-exist on the same cluster as Spark 2."

 

https://www.cloudera.com/documentation/spark2/latest/topics/spark2_requirements.html

 

 

 

Re: Problem removing Spark 1.6 from my cluster (Accepted Solution)

Contributor

@saranvisa Thanks, that worked.

Also, in order to point to the new Spark, I had to change some symbolic links and ENV variables.

 

export SPARK_DIST_CLASSPATH=$(hadoop classpath)
ln -sf /opt/cloudera/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/lib/spark2/bin/spark-shell /etc/alternatives/spark-shell
export SPARK_HOME=/opt/cloudera/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/lib/spark2
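
The relink above can be sanity-checked with `readlink`. A sketch of the same pattern, run against a throwaway directory so nothing on a real cluster is touched (on a real node the link target would be the SPARK2 parcel path and the link itself `/etc/alternatives/spark-shell`):

```shell
# Sketch of the relink from the post, done in a scratch directory so it is
# safe to run anywhere. The fake spark-shell just identifies itself.
demo=$(mktemp -d)
mkdir -p "$demo/spark2/bin"
printf '#!/bin/sh\necho spark2-shell\n' > "$demo/spark2/bin/spark-shell"
chmod +x "$demo/spark2/bin/spark-shell"

# Same pattern as: ln -sf .../lib/spark2/bin/spark-shell /etc/alternatives/spark-shell
ln -sf "$demo/spark2/bin/spark-shell" "$demo/spark-shell"

readlink "$demo/spark-shell"   # shows the spark2 path the link resolves to
"$demo/spark-shell"            # prints: spark2-shell
```

`ln -sf` replaces any existing link atomically enough for this purpose, which is why re-running it after a parcel upgrade is harmless.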

Re: Problem removing Spark 1.6 from my cluster

New Contributor

Where do I have to set the environment variables? Do I have to change them on the master, or on the Spark services running on the slave machines?

 

Please help with this!

Re: Problem removing Spark 1.6 from my cluster

Contributor

If you want to use spark2-shell and spark2-submit, you don't have to set those ENV variables. I set them because I wanted to point the current spark-shell/spark-submit to Spark 2.

 

This should be done on all the nodes where you want to use the shell and/or submit.

 

I forgot to add the changes that I made for spark-submit.

In these files:

 

/opt/cloudera/parcels/CDH-5.8.0-1.cdh5.8.0.p0.42/bin/spark-submit
/opt/cloudera/parcels/CDH-5.8.0-1.cdh5.8.0.p0.42/lib/spark/bin/spark-submit

Add this ENV var:

SPARK_HOME=/opt/cloudera/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/lib/spark2
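
One way that edit could be scripted (a sketch assuming GNU sed, demonstrated on a scratch copy; on a real node you would apply it to the two spark-submit wrapper files listed above, keeping backups first):

```shell
# Inject SPARK_HOME into a spark-submit wrapper, right after the shebang.
# Demonstrated on a scratch copy; the parcel path is the one from the post.
SPARK2_HOME=/opt/cloudera/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/lib/spark2
work=$(mktemp -d)
printf '#!/bin/sh\nexec spark-submit "$@"\n' > "$work/spark-submit"

# GNU sed: append the export after line 1 (the shebang).
sed -i "1a export SPARK_HOME=$SPARK2_HOME" "$work/spark-submit"
grep SPARK_HOME "$work/spark-submit"
```

Putting the export inside the wrapper (instead of in each user's shell profile) means every invocation of that spark-submit picks up Spark 2, regardless of who runs it.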