Problem removing Spark 1.6 from my cluster
Labels: Apache Spark, Cloudera Manager
Created on 06-09-2017 08:33 AM - edited 09-16-2022 04:44 AM
Hello,
I want to remove Spark 1.6 in order to install Spark 2.1, but when I try to remove it with this command:
sudo yum remove spark-core spark-master spark-worker spark-history-server spark-python
(source: https://www.cloudera.com/documentation/enterprise/5-8-x/topics/cdh_ig_cdh_comp_uninstall.html)
the packages are not found.
What should I do to remove Spark 1.6 from my cluster? Also, in a previous step I already deleted it from my services.
I'm using CDH 5.8.0.
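A quick way to see why yum cannot find those packages is to check whether Spark 1.6 was installed from packages at all or was delivered through a parcel. This is only a hedged sketch; the /opt/cloudera/parcels directory is an assumption based on the default Cloudera Manager layout:

# List package-based Spark installs (nothing shows up on a parcel-based cluster)
yum list installed | grep -i spark

# List parcels; on a parcel-based install, Spark 1.6 ships inside the CDH parcel
ls /opt/cloudera/parcels | grep -i -E 'spark|^CDH'

If Spark came from the CDH parcel, there is nothing for yum to remove; the Spark service is added and removed through Cloudera Manager instead.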
Created 06-09-2017 10:57 AM
To my knowledge, you don't need to remove Spark 1.x in order to install Spark 2.x.
You can have both, and CM will show separate services for spark (Spark 1.x) and spark2 (Spark 2.x).
Note: this applies only to Spark.
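As a hedged sanity check (assuming the Spark 2 parcel and CSD are already activated, so the spark2-* commands exist), you can confirm that the two services coexist from any gateway node:

spark-submit --version    # should report Spark 1.6.x from the CDH parcel
spark2-submit --version   # should report Spark 2.x from the SPARK2 parcel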
Created 06-09-2017 11:01 AM
Got the link now and it says "A Spark 1.6 service can co-exist on the same cluster as Spark 2."
https://www.cloudera.com/documentation/spark2/latest/topics/spark2_requirements.html
Created 06-09-2017 01:52 PM
@saranvisa Thanks, that worked.
Also, in order to point to the new Spark, I had to change some symbolic links and ENV variables.
export SPARK_DIST_CLASSPATH=$(hadoop classpath)
ln -sf /opt/cloudera/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/lib/spark2/bin/spark-shell /etc/alternatives/spark-shell
export SPARK_HOME=/opt/cloudera/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/lib/spark2
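A minimal sketch for verifying the repointing, assuming the same parcel path as above:

echo $SPARK_HOME                            # should print the SPARK2 parcel's lib/spark2 directory
readlink -f /etc/alternatives/spark-shell   # should resolve into the SPARK2 parcel
spark-shell --version                       # should now report Spark 2.1.0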
Created on 08-09-2017 10:24 PM - edited 08-10-2017 02:40 AM
Where do I have to set the environment variables? Do I have to change them on the master, or on the Spark services running on the slave machines?
Please help with this!
Created 08-10-2017 08:20 AM
If you want to use spark2-shell and spark2-submit, you don't have to set those environment variables. I set them because I wanted to point the existing spark-shell/spark-submit to Spark 2.
This should be done on every node where you want to use the shell and/or submit.
I forgot to add the changes that I made for spark-submit.
In these files:
/opt/cloudera/parcels/CDH-5.8.0-1.cdh5.8.0.p0.42/bin/spark-submit
/opt/cloudera/parcels/CDH-5.8.0-1.cdh5.8.0.p0.42/lib/spark/bin/spark-submit
Add this ENV var:
SPARK_HOME=/opt/cloudera/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/lib/spark2
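For reference, here is a hedged sketch of making that edit on one node; it assumes each wrapper script starts with a shebang line (so the export is inserted as line 2) and it backs the files up first:

for f in /opt/cloudera/parcels/CDH-5.8.0-1.cdh5.8.0.p0.42/bin/spark-submit \
         /opt/cloudera/parcels/CDH-5.8.0-1.cdh5.8.0.p0.42/lib/spark/bin/spark-submit; do
  sudo cp "$f" "$f.bak"   # keep a backup of the original wrapper
  sudo sed -i '2i export SPARK_HOME=/opt/cloudera/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/lib/spark2' "$f"
done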
