Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.
Announcements
This board is archived and read-only for historical reference. To ask a new question, please post a new topic on the appropriate active board.

Different Spark versions on HDP 2.4.0.0-169

Expert Contributor

Hello,

In this HDP version, Spark 1.6.0.2.4 was installed during cluster installation. Now we want to experiment with SAP Vora 1.2, but sadly it only works with Spark 1.5.1 or 1.5.2. I believe I have two options:

1) Uninstall Spark 1.6 and install version 1.5.2. Problem: where can I download a package ready to use with Ambari?

2) Install it manually in a different directory (e.g. /usr/apache/spark-1.5.2) from http://d3kbcqa49mib13.cloudfront.net/spark-1.5.2-bin-hadoop2.4.tgz. Problem: how do I set up a 1.5.2 environment without affecting the installed 1.6 version?
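The side-by-side setup described in option 2 can be sketched roughly like this (a sketch only, not verified against HDP; the /usr/apache path and download URL are the ones from the post above):

```shell
#!/bin/sh
# Sketch for option 2: unpack Spark 1.5.2 outside the HDP tree and
# activate it only for the current shell, leaving HDP's Spark 1.6 untouched.
# Assumed install location (the example path from the post):
SPARK152_HOME=/usr/apache/spark-1.5.2-bin-hadoop2.4

# One-time install, commented out here; run manually with root privileges:
# wget http://d3kbcqa49mib13.cloudfront.net/spark-1.5.2-bin-hadoop2.4.tgz
# mkdir -p /usr/apache
# tar -xzf spark-1.5.2-bin-hadoop2.4.tgz -C /usr/apache

# Per-session activation: nothing under /usr/hdp or /usr/bin is modified,
# so the Ambari-managed Spark 1.6 installation is unaffected.
export SPARK_HOME="$SPARK152_HOME"
export PATH="$SPARK_HOME/bin:$PATH"
```

Because the 1.5.2 tree is activated only via environment variables in the current shell, closing the shell restores the stock HDP Spark 1.6 setup.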

🙂 Klaus

1 ACCEPTED SOLUTION

Master Mentor

You can't install Spark 1.5.2 on HDP 2.4.x unless you build it from source and deploy it yourself; you can get the sources from the Apache Spark website. As long as you keep your Spark init scripts in a non-HDP-specific directory, you should be fine. Make sure /usr/bin/spark does not collide with HDP's Spark. It's best to wait for SAP Vora 1.3, which I believe will support Spark 1.6. The next major HDP release will be able to run two independent versions of Spark simultaneously (1.6.2 and a 2.0 technical preview).
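The collision check suggested above can be made explicit per shell session, for example like this (a sketch under assumptions: the 1.5.2 path is the example from the question, and /etc/hadoop/conf is HDP's usual client-configuration location):

```shell
#!/bin/sh
# Assumed location of the manually installed Spark 1.5.2 (from the question).
export SPARK_HOME=/usr/apache/spark-1.5.2-bin-hadoop2.4
export PATH="$SPARK_HOME/bin:$PATH"

# Reuse the cluster's Hadoop client configuration so the standalone Spark
# can still reach HDFS/YARN; /etc/hadoop/conf is where HDP normally keeps it.
export HADOOP_CONF_DIR=/etc/hadoop/conf

# Sanity check: the spark-submit now first on PATH must come from the
# 1.5.2 tree, not from HDP's /usr/bin wrapper.
resolved=$(command -v spark-submit)
case "$resolved" in
  "$SPARK_HOME"/bin/*) echo "using standalone 1.5.2: $resolved" ;;
  *) echo "WARNING: spark-submit would resolve to: $resolved" ;;
esac
```

If the warning branch fires, an HDP wrapper script earlier on PATH (e.g. in /usr/bin) is shadowing the standalone install and should be inspected before running jobs.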


2 REPLIES


Expert Contributor

Hello Artem,

Many thanks for your answer.

🙂 Klaus