Created 04-26-2016 08:20 PM
I installed Zeppelin manually on my node (not the sandbox), but after following the instructions on configuring the Spark notebook I noticed that running "sc.version" throws the error below:
sc.version
java.net.ConnectException: Connection refused
    at java.net.PlainSocketImpl.socketConnect(Native Method)
    at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
    at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
    at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    at java.net.Socket.connect(Socket.java:589)
    at org.apache.thrift.transport.TSocket.open(TSocket.java:182)
    at org.apache.zeppelin.interpreter.remote.ClientFactory.create(ClientFactory.java:51)
    at org.apache.zeppelin.interpreter.remote.ClientFactory.create(ClientFactory.java:37)
    at org.apache.commons.pool2.BasePooledObjectFactory.makeObject(BasePooledObjectFactory.java:60)
    at org.apache.commons.pool2.impl.GenericObjectPool.create(GenericObjectPool.java:861)
    at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:435)
    at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:363)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterProcess.getClient(RemoteInterpreterProcess.java:142)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.getFormType(RemoteInterpreter.java:271)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.getFormType(LazyOpenInterpreter.java:104)
    at org.apache.zeppelin.notebook.Paragraph.jobRun(Paragraph.java:199)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:171)
    at org.apache.zeppelin.scheduler.RemoteScheduler$JobRunner.run(RemoteScheduler.java:326)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Created 04-26-2016 08:44 PM
@Koffi Did you configure the interpreter correctly?
Refer to this link for more details:
http://hortonworks.com/hadoop-tutorial/apache-zeppelin-hdp-2-4/
Created on 04-26-2016 09:40 PM - edited 08-19-2019 03:43 AM
@Yogeshprabhu Thanks very much for getting back to me. I redid the steps listed in the link above and am still experiencing the same errors. I attached photos of my configs. Thanks a lot.
Created 04-27-2016 01:19 AM
Can you also post a screenshot of the YARN interpreter settings in your Zeppelin?
Created on 04-27-2016 03:20 PM - edited 08-19-2019 03:43 AM
@Yogeshprabhu I attached the spark-yarn-client interpreter settings. Thanks.
Created 04-26-2016 11:07 PM
Try restarting the Spark interpreter within Zeppelin; if that doesn't work, restart Zeppelin itself. zeppelin-daemon.sh restart can do the Zeppelin restart.
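For reference, a restart from the shell might look like this (the install path is an assumption for a manual install; adjust to wherever Zeppelin lives on your node):

    # assumed manual-install location; substitute your own ZEPPELIN_HOME
    cd /opt/zeppelin
    bin/zeppelin-daemon.sh restart
    # verify it came back up
    bin/zeppelin-daemon.sh status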
Created 04-26-2016 11:12 PM
@vshukla I restarted the Zeppelin interpreter and ran zeppelin-daemon.sh restart, but I'm still getting the same error. Thanks.
Created 04-26-2016 11:40 PM
Check your local machine's firewall settings and make sure nothing else is running on the same ports. Make sure nothing requires root access. Restarting the server will also sometimes do the trick.
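A quick way to check for a port conflict from the shell (9995 is Zeppelin's default HTTP port on HDP; whether you have lsof or netstat depends on your OS):

    # show whatever process is listening on Zeppelin's port
    sudo lsof -i :9995
    # or, equivalently
    sudo netstat -tlnp | grep 9995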
Created 04-27-2016 04:34 PM
@Timothy Spann Nothing is conflicting with the firewall settings, and there are no services running on port 9995 except Zeppelin. Since the server is in use, I'll restart it at the next downtime.
Created 04-27-2016 07:24 PM
Is Spark running in the cluster? Is it on the default port?
Can you access Spark? Can you get to the Spark History Server UI?
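If in doubt, you can probe the Spark History Server from the shell (18080 is the usual default port; the hostname is a placeholder for your own):

    # a 200 response means the history UI is reachable
    curl -s -o /dev/null -w "%{http_code}\n" http://<spark-history-host>:18080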
Created on 04-27-2016 07:49 PM - edited 08-19-2019 03:43 AM
@Timothy Spann Yes, Spark is running in the cluster and on its default port (I never changed it). I attached the configuration screen for the Spark interpreter. I can also access Spark from the command line and from the UI; both work perfectly. Thanks!
Created 04-27-2016 07:59 PM
A local master is not using the YARN version of Spark; it's running a local version. Is that running?
Is the green connected light on in the upper right corner?
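To make the distinction concrete, these are the two values of the master property in the Zeppelin Spark interpreter settings for Spark 1.x:

    master = local[*]       # runs Spark inside the interpreter's own JVM; never touches YARN
    master = yarn-client    # submits the interpreter's Spark jobs to the cluster via YARN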
Created 04-29-2016 03:22 PM
@Timothy Spann Should the master be set to "yarn-client" like it was in "spark-yarn-interpreter"? The cluster is running Spark 1.6 and works perfectly from the command line, and yes, the green connected light is on in the upper right corner.
Created 09-12-2016 07:51 PM
Yes; if you use the Zeppelin that is now installed with Spark, this should be resolved.
Created 12-31-2016 10:24 PM
Please try running it in yarn-cluster mode.
Created 01-17-2017 10:50 PM
I have the same problem and have tried all the suggestions written above, but I still get the error message. I'd appreciate your suggestions. @Koffi @Timothy Spann @Yogeshprabhu
Created 01-18-2017 01:09 AM
Are you using the out-of-the-box Zeppelin installed through Ambari, version 0.6.0?
How much RAM do you have?
What version of HDP? Ambari? JDK?
Does your cluster have Spark?
Any logs? (See the note after this list for where to look.)
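On the last point, the Spark interpreter writes its own log file; typical locations look like this (paths assumed for an Ambari-managed install vs. a manual one):

    # Ambari-managed HDP install
    ls /var/log/zeppelin/zeppelin-interpreter-spark-*.log
    # manual install
    ls $ZEPPELIN_HOME/logs/zeppelin-interpreter-spark-*.log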
Created 01-06-2017 05:04 PM
This connection error usually means that the interpreter has failed for some reason.
1. First of all, check the interpreter's log in the logs directory.
2. As you use yarn-client, I suspect Spark has not been configured properly to use YARN. Check whether you have the right yarn-site.xml and core-site.xml in your $SPARK_CONF_DIR. You should also check that SPARK_HOME and SPARK_CONF_DIR are set in your zeppelin-env.sh.
3. Usually the spark-submit parameters are visible in the interpreter log, so you can check the log and try to submit an example application from the command line with the same parameters (see the sketch after this list).
4. Sometimes spark-submit works fine but the YARN application master fails for some reason, so you should also check whether your application shows up on the Spark web UI.
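As a concrete sketch of points 2 and 3, the env settings and a command-line smoke test might look like this (all paths are assumptions for an HDP-style Spark 1.6 layout; adjust to your install):

    # zeppelin-env.sh -- point Zeppelin at the cluster's Spark and its Hadoop configs
    export SPARK_HOME=/usr/hdp/current/spark-client        # assumed HDP client path
    export SPARK_CONF_DIR=$SPARK_HOME/conf                 # should contain yarn-site.xml and core-site.xml
    export HADOOP_CONF_DIR=/etc/hadoop/conf

    # reproduce the interpreter's submit from the command line with the same master
    $SPARK_HOME/bin/spark-submit --master yarn-client \
      --class org.apache.spark.examples.SparkPi \
      $SPARK_HOME/lib/spark-examples*.jar 10

If SparkPi fails here too, the problem is in the Spark/YARN configuration rather than in Zeppelin itself.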
Created 01-25-2018 10:13 PM
I had a similar stack trace issue. As per the suggestion, I checked zeppelin-env.sh and noticed that SPARK_HOME was commented out. I corrected it, but we still hit the same error. It would be appreciated if you could provide more details to solve it. We are using HDP-2.6.1.12 with Ambari 2.5.
Created 04-04-2018 02:21 AM
May I know the fix for this error?