Issues upgrading Spark from Spark 1.3 -> Spark 1.4
Created on ‎06-25-2015 06:12 PM - edited ‎09-16-2022 02:32 AM
Hello,
I have Spark 1.2 running on CDH 5.3.1 and would like to upgrade to Spark 1.3.
1) According to these two documentation pages for CDH 5.4 and 5.0.x, the command is:
sudo yum install spark-core spark-master spark-worker spark-history-server spark-python
However, yum reports that spark-core and the other packages are not available.
2) Do I have to upgrade CDH to run Spark 1.3?
Thanks!
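Before debugging further, it is worth checking whether yum can see a Cloudera repository at all, since "no package available" usually means the repo is missing or stale. A minimal sketch (editorial addition; assumes a yum-based RHEL/CentOS host where repo files live under /etc/yum.repos.d/):

```shell
# List the repos yum currently knows about; a Cloudera CDH repo
# should appear here if it has been set up.
yum repolist

# Check whether any repo file mentions Cloudera at all.
grep -ril cloudera /etc/yum.repos.d/ || echo "no Cloudera repo file found"

# Ask yum which spark packages (if any) are visible from the enabled repos.
yum list available 'spark*'

# After adding or changing a repo file, refresh the metadata cache.
yum clean all && yum makecache
```

If `yum list available 'spark*'` returns nothing even with a Cloudera repo enabled, the repo likely points at a CDH release that ships Spark only as a parcel, not as packages.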
Created ‎06-25-2015 11:36 PM
These are instructions for installing via packages, which is not the usual way to do it. Do you really intend to install from packages? If so, have you set up the Cloudera repos?
Generally you manage Spark using parcels, and yes, updating Spark means updating CDH, since you are really updating many other harmonized dependencies along with it.
Created ‎06-29-2015 12:43 AM
We do not support upgrading Spark without upgrading the rest of CDH.
Spark is compiled against a specific version of Hadoop, and the Hadoop version can change between releases of CDH.
You also need to take into account Spark's dependencies (such as Hive), which may change between versions.
Even if you managed to upgrade the package, you could see strange failures due to the broken dependencies.
Wilfred
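One way to see the coupling Wilfred describes is to compare the component versions a cluster actually ships together. A hedged sketch (editorial addition; the parcel path assumes a parcel-based install under the default /opt/cloudera/parcels, which may differ on your cluster):

```shell
# Print the Spark version installed on the cluster.
spark-submit --version

# Print the Hadoop version this cluster runs; Spark was compiled
# against a specific Hadoop release, so these must stay in step.
hadoop version

# On a parcel-based install, the CDH parcel pins Spark, Hadoop, Hive,
# and the rest together as one versioned unit; list the parcels to see it.
ls /opt/cloudera/parcels/
```

Upgrading Spark alone would desynchronize the first two versions, which is exactly the dependency breakage described above.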
Created ‎06-29-2015 07:02 PM
Thank you for the answer, Sean. I actually misspoke: I just need to upgrade to Spark 1.3 (I'm currently on Spark 1.2).
I've been trying to use this guide:
https://s3.amazonaws.com/quickstart-reference/cloudera/hadoop/latest/doc/Cloudera_EDH_on_AWS.pdf
But I am still only getting Spark 1.2. Do you have any suggestions on how I can use this guide to get Spark 1.3?
