Support Questions
Find answers, ask questions, and share your expertise

HDP2.6 / Zeppelin0.7 / How to have both Spark 1.6.3 & spark2 interpreters?

Solved

Rising Star

Hello,

After upgrading HDP to 2.6 and getting the GA release of Spark 2.1, I'm trying (with no luck so far) to add a Spark2 interpreter in Zeppelin (if that is possible at all).

I created a new spark2 interpreter in Zeppelin:

[Screenshot: spark2 interpreter settings — 15741-spark2.png]

It instantiates properly (%spark2), however sc.version indicates that I'm still running Spark 1.6.3.

Digging into the config and the docs, I found that SPARK_HOME is defined in zeppelin-env.sh, pointing by default to Spark 1.6.3.

Editing that config and restarting Zeppelin "works" in the sense that I can now successfully instantiate Spark2, but Spark 1.6.3 is no longer available from the notebook (and Livy is still configured for Spark 1.6.3).
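One way around the single global SPARK_HOME is to leave it unset in zeppelin-env.sh and instead set SPARK_HOME as a per-interpreter property in the Interpreter screen. A minimal sketch, assuming the HDP default install paths (verify the paths on your cluster):

```bash
# zeppelin-env.sh — comment out the global SPARK_HOME so each
# interpreter group can carry its own value:
# export SPARK_HOME=/usr/hdp/current/spark-client

# Then, in the Zeppelin Interpreter UI, set SPARK_HOME as a
# property on each interpreter (paths are assumptions based on
# HDP defaults):
#   spark  -> SPARK_HOME = /usr/hdp/current/spark-client
#   spark2 -> SPARK_HOME = /usr/hdp/current/spark2-client
```

With SPARK_HOME resolved per interpreter rather than globally, %spark and %spark2 no longer compete for the same setting.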

Is there any way to create interpreters that allow using both Spark 1.6.3 and Spark 2 from Zeppelin 0.7?

Thanks

Christophe

1 ACCEPTED SOLUTION

Accepted Solutions

Re: HDP2.6 / Zeppelin0.7 / How to have both Spark 1.6.3 & spark2 interpreters?

Expert Contributor
@Christophe Vico

I recommend you download the Sandbox:

https://hortonworks.com/products/sandbox/

From Zeppelin, within a single notebook, you can run different versions of Spark (1.6.3 or 2.1) depending on your choice of interpreter: %spark.spark or %spark2.spark.

You can review the settings used in the Interpreter screen.
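For illustration, the two interpreters might be exercised side by side in one notebook like this (a sketch; paragraph prefixes follow the interpreter names above, and exact version strings depend on your install):

```scala
// Paragraph 1 — bound to the Spark 1.6.3 interpreter
%spark
sc.version   // expected to report a 1.6.x version

// Paragraph 2 — bound to the Spark 2.1 interpreter
%spark2
sc.version   // expected to report a 2.1.x version
```

Each paragraph runs in its own interpreter process, so the two Spark versions coexist without interfering with each other.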

Regards,
