
Can I upgrade Apache Spark when I'm using packages?

Solved


Explorer

Hi, right now I'm using parcels, but if I wanted to upgrade to packages (or rather, migrate to packages), could I then upgrade to Apache Spark 2.0.2? Could I still use Cloudera Manager to control it?


5 REPLIES

Re: Can I upgrade Apache Spark when I'm using packages?

Explorer

There isn't even a working link to the "Spark 2.0 Beta" either: http://www.cloudera.com/downloads/beta/spark2/2-0-0.html

Re: Can I upgrade Apache Spark when I'm using packages?

Expert Contributor

Spark 2.0 is available as a parcel as well, so you shouldn't need to move to packages unless you have another reason. Spark 2.0 is now out of beta and GA. Here is more information on how to install Spark 2 with Cloudera Manager: http://www.cloudera.com/documentation/spark2/latest/topics/spark2_installing.html
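
If you'd rather script the parcel rollout than click through the Cloudera Manager UI, a rough Python sketch against the CM REST API is below. The host, credentials, API version, cluster name and parcel version string are all placeholders, and the endpoint paths and stage names reflect my reading of the CM API docs, so verify them against your own CM instance before relying on this.

# Hedged sketch: drive the SPARK2 parcel lifecycle (download, distribute, activate)
# through the Cloudera Manager REST API. Host, credentials, cluster name, API
# version and parcel version string below are assumptions -- adapt them to your
# own Cloudera Manager deployment.
import time
import requests

CM = "http://cm-host.example.com:7180/api/v14"    # hypothetical CM host / API version
AUTH = ("admin", "admin")                         # replace with real credentials
CLUSTER = "cluster"                               # your cluster's name in CM
PRODUCT = "SPARK2"
VERSION = "2.0.0.cloudera2-1.cdh5.7.0.p0.118100"  # example parcel version string

def parcel_url(action=None):
    """Build the parcel resource URL, optionally for a parcel command."""
    base = f"{CM}/clusters/{CLUSTER}/parcels/products/{PRODUCT}/versions/{VERSION}"
    return f"{base}/commands/{action}" if action else base

def wait_for(stage):
    """Poll the parcel resource until it reports the requested stage."""
    while True:
        state = requests.get(parcel_url(), auth=AUTH).json()
        if state.get("stage") == stage:
            return
        time.sleep(10)

# Download to the CM server, distribute to all hosts, then activate.
for command, stage in [("startDownload", "DOWNLOADED"),
                       ("startDistribution", "DISTRIBUTED"),
                       ("activate", "ACTIVATED")]:
    requests.post(parcel_url(command), auth=AUTH).raise_for_status()
    wait_for(stage)

print(f"{PRODUCT} {VERSION} stage: {requests.get(parcel_url(), auth=AUTH).json().get('stage')}")

Note that this only handles the parcel itself; the Spark 2 CSD still has to be in place on the CM server (per the installation doc linked above) before the service can be added.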

Re: Can I upgrade Apache Spark when I'm using packages?

Explorer

Hey man, thanks for the link, but that's for CDH 5.7. Can you tell me whether that will be a problem with CDH 5.9? I don't have CDH 5.7, and I'm not going to try to revert to an older version.


Re: Can I upgrade Apache Spark when I'm using packages?

Explorer

I installed my cluster (CDH 5.10.0) using packages. I would like to try Spark 2, but noticed it is currently only available as a parcel, and I can't find any instructions for my situation. So my question is: is it possible to install the Spark 2 parcel on my cluster, even though the cluster was originally installed using packages?


Re: Can I upgrade Apache Spark when I'm using packages?

Expert Contributor
No problem. The name is a bit misleading, but 5.7 is the minimum version required; installing that parcel won't be a problem with 5.9. The requirements section [1] has a bit more information on supported versions.

1. http://www.cloudera.com/documentation/spark2/latest/topics/spark2_requirements.html
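
If you want to double-check the cluster's CDH version programmatically before adding the parcel, something like the sketch below could work. The CM host, credentials, API version and the fullVersion field name are assumptions based on my reading of the CM API, so confirm them for your own deployment.

# Hedged sketch: confirm the cluster's CDH version meets the Spark 2 minimum (5.7)
# before adding the parcel. Host, credentials and the "fullVersion" field are
# assumptions -- verify them against your Cloudera Manager API docs.
import requests

CM = "http://cm-host.example.com:7180/api/v14"  # hypothetical CM host / API version
AUTH = ("admin", "admin")                       # replace with real credentials
CLUSTER = "cluster"                             # your cluster's name in CM
MINIMUM = (5, 7)

cluster = requests.get(f"{CM}/clusters/{CLUSTER}", auth=AUTH).json()
version = cluster.get("fullVersion", "0.0")     # e.g. "5.9.0"
major_minor = tuple(int(part) for part in version.split(".")[:2])

if major_minor >= MINIMUM:
    print(f"CDH {version} meets the Spark 2 minimum of {MINIMUM[0]}.{MINIMUM[1]}")
else:
    print(f"CDH {version} is below the Spark 2 minimum; upgrade CDH first")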