Install two versions of Spark on the same cluster
Labels: Apache Spark
Created 10-11-2016 09:12 PM
Hi,
Is it possible to install two versions of Spark on the same cluster? I know that HDP 2.5 supports both Spark 1.6 and Spark 2.0, but I want to install Spark 1.5 and Spark 1.6 on HDP 2.3.6.
The reason for this requirement is that we are integrating external tools with Hadoop and Spark, where one tool supports only Spark 1.5 and the other requires Spark 1.6.
Any help is highly appreciated.
Created 10-15-2016 03:32 AM
Yes, it is technically possible, and it has even been done. HDP 2.5 includes two versions of Spark: 1.6.2 at production level and 2.0 as a technical preview. They co-exist by running separate timeline servers. You can add Spark 2.0 through the Ambari UI with "Add Service". In this case the reason is to provide a preview of Spark 2.0; whether running two versions makes sense for you is a business decision.
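A minimal sketch of how the selection works on an HDP 2.5 node once both services are installed, assuming the usual HDP directory layout:

```bash
# Both builds live side by side under /usr/hdp/current:
#   spark-client  -> Spark 1.6.2 (the default)
#   spark2-client -> Spark 2.0 technical preview
spark-submit --version              # picks up Spark 1.6.2

# SPARK_MAJOR_VERSION switches the wrapper scripts to Spark 2
export SPARK_MAJOR_VERSION=2
spark-submit --version              # now picks up Spark 2.0
```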
If any of the responses were helpful, don't forget to vote and accept the best answer.
Created 10-11-2016 09:15 PM
@SBandaru in a technical sense you could do it, but it would not be supported. If Spark 1.6.x is required, I recommend upgrading to HDP 2.4.2. Regarding the tool that only supports the older version, I would reach out to the vendor and ask them to come up to the latest release. Spark has moved on to 2.0; being on 1.5 puts you well behind the ball in terms of Spark.
Created 10-15-2016 01:02 PM
I have a cluster with two Spark versions: one was installed with the cluster, and one manually for Zeppelin and the Livy server. No issues.
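For reference, a rough sketch of what such a manual side-by-side install can look like; the Spark version, download URL, and /opt paths below are illustrative assumptions, not necessarily what was used here:

```bash
# Unpack a stock Apache Spark build next to the HDP-managed one
# (version, URL, and target directory are illustrative assumptions)
cd /opt
wget https://archive.apache.org/dist/spark/spark-1.6.3/spark-1.6.3-bin-hadoop2.6.tgz
tar -xzf spark-1.6.3-bin-hadoop2.6.tgz

# Reuse the cluster's existing Hadoop/YARN client configuration
export SPARK_HOME=/opt/spark-1.6.3-bin-hadoop2.6
export HADOOP_CONF_DIR=/etc/hadoop/conf

# Smoke test on YARN; HDP's own spark-client stays untouched
$SPARK_HOME/bin/spark-submit --master yarn --deploy-mode client \
  --class org.apache.spark.examples.SparkPi \
  $SPARK_HOME/lib/spark-examples-*.jar 10
```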
Created 06-15-2017 04:52 PM
I have installed Spark 2.1 manually on HDP 2.3.4, while another version, Spark 1.5, is already installed via HDP. When I try to run jobs in yarn-cluster mode, Spark 2.1 resolves to the HDP 2.3.4 Spark libraries, resulting in "bad substitution" errors. Any ideas how you were able to resolve this when using two Spark versions?
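For anyone landing here later: on HDP this "bad substitution" error typically comes from the unresolved ${hdp.version} placeholder in the cluster's MapReduce/YARN classpath entries, which a manually installed Spark does not fill in. A commonly cited workaround, not confirmed in this thread, is to pin the concrete build number yourself; the value 2.3.4.0-3485 below is a placeholder for whatever your cluster actually reports:

```bash
# Commonly cited workaround: pin hdp.version so ${hdp.version} in the
# HDP classpath entries resolves. 2.3.4.0-3485 is a placeholder; check
# the real build number with:  hdp-select status hadoop-client
$SPARK_HOME/bin/spark-submit --master yarn --deploy-mode cluster \
  --conf "spark.driver.extraJavaOptions=-Dhdp.version=2.3.4.0-3485" \
  --conf "spark.yarn.am.extraJavaOptions=-Dhdp.version=2.3.4.0-3485" \
  --class org.apache.spark.examples.SparkPi \
  $SPARK_HOME/examples/jars/spark-examples_*.jar 10
```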
