
How to downgrade Spark

New Contributor

We are currently on Cloudera 5.5.2 with Spark 1.5.0, and have installed the SAP HANA Vora 1.1 service, which works well.


The SAP HANA Vora Spark Extensions currently require Spark 1.4.1, so we would like to downgrade Spark from 1.5.0 to 1.4.1. How can we do this? Thanks!


Master Guru
Spark is a built-in component of CDH and moves with the CDH version releases. There is no way to downgrade just a single component of CDH, as the components are built and tested to work together in the versions shipped.

Do the extensions not work with 1.5? Are you observing incompatible changes (between the claimed support for 1.4.1 vs. 1.5.0) that are causing them to fail?

New Contributor

Hi Harsh,


Thank you. So we should be good if we downgrade CDH to a version that ships Spark 1.4.1, then?


There are multiple issues between 1.4.1 and 1.5.0. The developers have told us they are working on supporting Spark 1.5.0, and advised us to use Spark 1.4.1 in the meantime.

Master Guru
CDH 5.4 had Spark 1.3.0 plus patches, which per the blog post seems like it would not work either (it mentions a "strong dependency", which I take to mean ONLY 1.4.1?). CDH 5.5.x onwards carries Spark 1.5.x with patches. There has been no CDH5 release with Spark 1.4.x in it.

You could use an Apache Spark 1.4.1 release from upstream, manually rebuilt against your CDH5 version of Apache Hadoop, and use the tarball paths for all Spark operations; this should work.
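A rough sketch of that approach (the exact Hadoop version string and install paths below are assumptions for illustration; CDH 5.5.2 would correspond to a `2.6.0-cdh5.5.2` Hadoop artifact, and the build may need Cloudera's Maven repository added to the Spark pom):

```shell
# Hypothetical sketch, not tested advice: rebuild upstream Spark 1.4.1
# against the CDH build of Hadoop and run it from its own tarball paths.

# 1. Fetch the upstream Apache Spark 1.4.1 source release.
wget https://archive.apache.org/dist/spark/spark-1.4.1/spark-1.4.1.tgz
tar xzf spark-1.4.1.tgz && cd spark-1.4.1

# 2. Build a deployable distribution against the CDH Hadoop version
#    (assumed here to be 2.6.0-cdh5.5.2 for CDH 5.5.2).
./make-distribution.sh --tgz \
  -Phadoop-2.6 -Dhadoop.version=2.6.0-cdh5.5.2 \
  -Pyarn -Phive -DskipTests

# 3. Unpack the resulting tarball outside the CDH parcel paths and use
#    its bin/ scripts for all Spark operations instead of the CDH ones.
tar xzf spark-1.4.1-bin-*.tgz -C /opt
export SPARK_HOME=/opt/spark-1.4.1-bin-2.6.0-cdh5.5.2
"$SPARK_HOME"/bin/spark-submit --master yarn-client ...
```

The key point is step 3: the CDH-installed Spark stays untouched, and every job is launched through the standalone tarball's scripts.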

However, such a Spark deployment would not be officially supported by Cloudera Support (if you have a subscription).