[SOLVED] [Zeppelin] Job failed: Implementing class
Labels: Apache Spark, Apache Zeppelin
Created 12-07-2016 11:47 PM
On HDP 2.4 I've installed Zeppelin 0.6.1 with the Spark interpreter built with Scala 2.10 (Spark version is 1.6.1).
All interpreters work well except the Spark interpreter, which fails. The error in the log is:
INFO [2016-12-05 13:25:35,638] ({pool-2-thread-4} SchedulerFactory.java[jobStarted]:131) - Job remoteInterpretJob_1480965935638 started by scheduler org.apache.zeppelin.spark.SparkInterpreter1640235141
ERROR [2016-12-05 13:25:35,650] ({pool-2-thread-4} Job.java[run]:189) - Job failed
java.lang.IncompatibleClassChangeError: Implementing class
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:264)
    at org.apache.zeppelin.spark.Utils.isScala2_10(Utils.java:88)
    at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:570)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:93)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:341)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:176)
    at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
INFO [2016-12-05 13:25:35,651] ({pool-2-thread-4} SchedulerFactory.java[jobFinished]:137) - Job remoteInterpretJob_1480965935638 finished by scheduler org.apache.zeppelin.spark.SparkInterpreter1640235141
In the zeppelin-env.sh file, the environment variables are:
export MASTER=yarn-client
export HADOOP_CONF_DIR="/etc/hadoop/conf"
export ZEPPELIN_JAVA_OPTS="-Dhdp.version=2.4.2.0-258 -Dspark.yarn.queue=default"
export SPARK_HOME="/usr/hdp/current/spark-client"
export PYTHONPATH="${SPARK_HOME}/python:${SPARK_HOME}/python/lib/py4j-0.8.2.1-src.zip"
export SPARK_YARN_USER_ENV="PYTHONPATH=${PYTHONPATH}"
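To rule out a missing path, the locations referenced above can be checked on the Zeppelin host (a minimal sketch, assuming the standard HDP 2.4 client layout):

# Confirm the Spark client and the py4j archive referenced by PYTHONPATH exist
ls -d /usr/hdp/current/spark-client
ls /usr/hdp/current/spark-client/python/lib/py4j-0.8.2.1-src.zip
# Confirm the Spark and Scala versions the interpreter will pick up
/usr/hdp/current/spark-client/bin/spark-submit --version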
Created 12-11-2016 11:07 PM
The problem was a previous Zeppelin installation from Ambari (v0.6.0) that was in maintenance mode but had not been uninstalled. So, when Zeppelin v0.6.1 starts up, it inherits an environment variable called CLASSPATH with the wrong classpath (because it uses Scala 2.11).
I solved it by adding this line at the top of the file ${HOME}/zeppelin-0.6.1/bin/common.sh:
unset CLASSPATH
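To verify the workaround, a minimal sketch of the before/after check, assuming the standard binary-distribution layout where zeppelin-daemon.sh lives next to common.sh:

# Check whether a stale CLASSPATH from the old Ambari-managed install is inherited
env | grep '^CLASSPATH='
# After adding "unset CLASSPATH" at the top of bin/common.sh, restart Zeppelin
${HOME}/zeppelin-0.6.1/bin/zeppelin-daemon.sh restart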
Created 12-11-2016 04:46 AM
Zeppelin 0.6.1 has several critical bugs in the Spark interpreter; please try Zeppelin 0.6.2.
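A minimal upgrade sketch, assuming the 0.6.2 binary package from the Apache archive (URL and file names assumed) and reusing the zeppelin-env.sh shown above:

# Fetch and unpack the Zeppelin 0.6.2 binary distribution (URL assumed)
wget https://archive.apache.org/dist/zeppelin/zeppelin-0.6.2/zeppelin-0.6.2-bin-all.tgz
tar -xzf zeppelin-0.6.2-bin-all.tgz
# Carry over the existing environment configuration
cp ${HOME}/zeppelin-0.6.1/conf/zeppelin-env.sh zeppelin-0.6.2-bin-all/conf/
# Stop the old instance and start the new one
${HOME}/zeppelin-0.6.1/bin/zeppelin-daemon.sh stop
zeppelin-0.6.2-bin-all/bin/zeppelin-daemon.sh start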
