
[SOLVED] [Zeppelin] Job failed: Implementing class

Contributor

On HDP 2.4 I've installed Zeppelin 0.6.1 with the Spark interpreter built with Scala 2.10 (the Spark version is 1.6.1).

All the other interpreters work well, but the Spark interpreter fails. The error message in the log is:

INFO [2016-12-05 13:25:35,638] ({pool-2-thread-4} SchedulerFactory.java[jobStarted]:131) - Job remoteInterpretJob_1480965935638 started by scheduler org.apache.zeppelin.spark.SparkInterpreter1640235141
ERROR [2016-12-05 13:25:35,650] ({pool-2-thread-4} Job.java[run]:189) - Job failed
java.lang.IncompatibleClassChangeError: Implementing class
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
        at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:264)
        at org.apache.zeppelin.spark.Utils.isScala2_10(Utils.java:88)
        at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:570)
        at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
        at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:93)
        at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:341)
        at org.apache.zeppelin.scheduler.Job.run(Job.java:176)
        at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
 INFO [2016-12-05 13:25:35,651] ({pool-2-thread-4} SchedulerFactory.java[jobFinished]:137) - Job remoteInterpretJob_1480965935638 finished by scheduler org.apache.zeppelin.spark.SparkInterpreter1640235141
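
For context, a java.lang.IncompatibleClassChangeError during class loading usually means that binary-incompatible jars (for example, Scala 2.10 and Scala 2.11 artifacts) are mixed on the same classpath. A rough way to check which Scala build each component pulls in, sketched against the paths from this post (adjust them to your install):

# The Spark client prints the Scala version it was built against.
/usr/hdp/current/spark-client/bin/spark-submit --version

# Look for Scala jars bundled with the standalone Zeppelin 0.6.1 install.
ls ${HOME}/zeppelin-0.6.1/lib | grep -i scala

# Check whether a CLASSPATH is already exported in the environment
# that launches zeppelin-daemon.sh.
env | grep -i '^CLASSPATH='

If the two sides report different Scala versions, that mismatch is the most likely trigger of the error above.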

In the zeppelin-env.sh file, the environment variables are:

export MASTER=yarn-client
export HADOOP_CONF_DIR="/etc/hadoop/conf"
export ZEPPELIN_JAVA_OPTS="-Dhdp.version=2.4.2.0-258 -Dspark.yarn.queue=default"
export SPARK_HOME="/usr/hdp/current/spark-client"
export PYTHONPATH="${SPARK_HOME}/python:${SPARK_HOME}/python/lib/py4j-0.8.2.1-src.zip"
export SPARK_YARN_USER_ENV="PYTHONPATH=${PYTHONPATH}"
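
For reference, a few quick checks that the paths referenced above exist on this node (a sketch against the values shown here; nothing in it changes the configuration):

# SPARK_HOME should resolve to the Spark 1.6.1 client install.
ls -l /usr/hdp/current/spark-client

# The py4j zip named in PYTHONPATH must exist for this Spark build.
ls /usr/hdp/current/spark-client/python/lib/

# hdp.version in ZEPPELIN_JAVA_OPTS should match an installed stack directory.
ls /usr/hdp/ | grep 2.4.2.0-258
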
Do you have any idea how to correct this error? Thanks in advance.
1 ACCEPTED SOLUTION

Contributor

The problem was a previous Zeppelin installation from Ambari (v0.6.0) that was in maintenance mode but had not been uninstalled. So, when Zeppelin v0.6.1 starts up, it inherits an environment variable called CLASSPATH with the wrong classpath (because it points to Spark built with Scala 2.11).

I solved it by adding this line at the top of ${HOME}/zeppelin-0.6.1/bin/common.sh:

unset CLASSPATH

View solution in original post

2 REPLIES

Super Collaborator

Zeppelin 0.6.1 has several critical bugs in the Spark interpreter; please try Zeppelin 0.6.2.

Contributor

The problem was a previous Zeppelin installation from Ambari (v0.6.0) that was in maintenance mode but had not been uninstalled. So, when Zeppelin v0.6.1 starts up, it inherits an environment variable called CLASSPATH with the wrong classpath (because it points to Spark built with Scala 2.11).

I solved it by adding this line at the top of ${HOME}/zeppelin-0.6.1/bin/common.sh:

unset CLASSPATH
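
For anyone hitting the same issue, a minimal sketch of the check and the fix, assuming the standalone install lives in ${HOME}/zeppelin-0.6.1 as above:

# Confirm that a stale CLASSPATH is being inherited by the Zeppelin start scripts.
echo "$CLASSPATH"

# Then neutralize it at the top of ${HOME}/zeppelin-0.6.1/bin/common.sh,
# before the script builds its own classpath:
unset CLASSPATH

Removing or fully uninstalling the old Ambari-managed Zeppelin 0.6.0 service should also avoid the stale variable in the first place.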