
Re: Configuring Zeppelin Spark Interpreters

Contributor

@Timothy Spann Yes, Spark is running in the cluster on its default port (I never changed the default port). I attached a screenshot of the Spark interpreter configuration. I can also access Spark from the command line and from the UI; both work perfectly. Thanks!

3752-spark-interpreter-screenshot.jpg

Re: Configuring Zeppelin Spark Interpreters

Super Guru

A master of local means Zeppelin is not using the YARN version of Spark; it's running a local instance. Is that instance running?

Is the green connected indicator lit in the upper right corner?
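
For reference, here is how the two settings differ in the interpreter's master property (a minimal sketch; the exact property list depends on your Zeppelin version):

    master = local[*]      # embedded local Spark on the Zeppelin host; no YARN involved
    master = yarn-client   # driver runs in Zeppelin, executors are requested from the cluster via YARN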

Re: Configuring Zeppelin Spark Interpreters

Contributor

@Timothy Spann Should the master be set to "yarn-client", as it was in "spark-yarn-interpreter"? The cluster is running Spark 1.6 and works perfectly from the command line, and yes, the green connected indicator is lit in the upper right corner.

Re: Configuring Zeppelin Spark Interpreters

Super Guru

Yes. If you use the Zeppelin that now ships with Spark, this should be resolved.

Re: Configuring Zeppelin Spark Interpreters

Contributor
@Koffi

Please try running it in yarn-cluster mode.
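
To test yarn-cluster mode outside Zeppelin first, here is a minimal sketch using the bundled SparkPi example (the jar path is typical for an HDP Spark client install; adjust for your layout):

    # Submit a trivial job straight to YARN, bypassing Zeppelin entirely
    spark-submit --master yarn-cluster \
      --class org.apache.spark.examples.SparkPi \
      /usr/hdp/current/spark-client/lib/spark-examples-*.jar 10

If this fails as well, the problem is in the cluster configuration rather than in Zeppelin.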

Re: Configuring Zeppelin Spark Interpreters

New Contributor

I have the same problem; I have tried all the suggestions above and still get the error message. I would appreciate your suggestions. @Koffi @Timothy Spann @Yogeshprabhu

Re: Configuring Zeppelin Spark Interpreters

Super Guru

Are you using the out-of-the-box Zeppelin installed through Ambari, version 0.6.0?

How much RAM do you have?

What versions of HDP, Ambari, and the JDK?

Does your cluster have Spark?

Any logs?
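
If it helps, typical commands for gathering those details on an HDP node look like this (paths and tool names assume a standard HDP/Ambari install):

    free -m                    # available RAM
    hdp-select versions        # installed HDP version(s)
    ambari-server --version    # Ambari version (run on the Ambari host)
    java -version              # JDK version
    ls /var/log/zeppelin/      # Zeppelin and interpreter logs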

Re: Configuring Zeppelin Spark Interpreters

Cloudera Employee

This connection error usually means that the interpreter process has failed for some reason.

1. First of all, check the interpreter's log in the logs directory.

2. Since you use yarn-client, I suspect Spark has not been configured properly to use YARN. Check that you have the right yarn-site.xml and core-site.xml in your $SPARK_CONF_DIR. You should also check that SPARK_HOME and SPARK_CONF_DIR are set in your zeppelin-env.sh (see the sketch after this list).

3. The spark-submit parameters are usually visible in the interpreter log, so you can take them from there and try to submit an example application from the command line with the same parameters.

4. Sometimes spark-submit works fine but the YARN application master fails for some reason, so also check whether your application shows up in the Spark web UI.
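
A minimal sketch of checks 1–3 (file names and paths are illustrative for a typical HDP layout, not taken from your cluster):

    # 1. Inspect the Spark interpreter log (exact file name varies by host/user):
    less /var/log/zeppelin/zeppelin-interpreter-spark-*.log

    # 2. In zeppelin-env.sh, these should be present and uncommented:
    export SPARK_HOME=/usr/hdp/current/spark-client
    export SPARK_CONF_DIR=$SPARK_HOME/conf
    # yarn-site.xml and core-site.xml must be in $SPARK_CONF_DIR

    # 3. Replay the submission outside Zeppelin with the parameters from the log:
    $SPARK_HOME/bin/spark-submit --master yarn-client \
      --class org.apache.spark.examples.SparkPi \
      $SPARK_HOME/lib/spark-examples-*.jar 10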

Re: Configuring Zeppelin Spark Interpreters

New Contributor

@melek

I had a similar stack trace. As suggested, I checked zeppelin-env.sh and noticed that SPARK_HOME was commented out. I uncommented it, but we still hit the same error. Any further details you can provide to solve this would be appreciated. We are using HDP-2.6.1.12 with Ambari 2.5.
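
One thing worth checking: edits to zeppelin-env.sh only take effect after Zeppelin is restarted, and on an Ambari-managed cluster the file is regenerated from Ambari's configuration, so a manual edit can be silently overwritten. A sketch, assuming the usual HDP paths (the Ambari config section name may differ slightly by version):

    # Ambari-managed clusters: set SPARK_HOME under
    # Zeppelin > Configs > Advanced zeppelin-env in the Ambari UI,
    # then restart the service from Ambari.
    # Manual installs: restart the daemon after editing the file:
    /usr/hdp/current/zeppelin-server/bin/zeppelin-daemon.sh restart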

Re: Configuring Zeppelin Spark Interpreters

Explorer

May I know the fix for this error?