Support Questions


spark: Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/impl/StaticLoggerBinder

New Contributor

Hello,

 

I am running Ubuntu 14.04 64-bit and using the Cloudera repository for CDH 5.

I installed Hadoop YARN on one node (with MongoDB and the mongo-hadoop connector).

I tried to install Spark using apt, but I can't get it to work.

 

Does anyone know what's happening and how to fix it? I ran:

 

sudo apt-get install spark-core spark-master spark-worker spark-history-server spark-python

 

but got:

Setting up spark-master (1.0.0+cdh5.1.3+45-1.cdh5.1.3.p0.10~precise-cdh5.1.3) ...
 * Starting Spark master (spark-master):
invoke-rc.d: initscript spark-master, action "start" failed.
dpkg: error processing package spark-master (--configure):
 subprocess installed post-installation script returned error exit status 1
Setting up spark-worker (1.0.0+cdh5.1.3+45-1.cdh5.1.3.p0.10~precise-cdh5.1.3) ...
 * Starting Spark worker (spark-worker):
invoke-rc.d: initscript spark-worker, action "start" failed.
dpkg: error processing package spark-worker (--configure):
 subprocess installed post-installation script returned error exit status 1
Errors were encountered while processing:
 spark-master
 spark-worker
E: Sub-process /usr/bin/dpkg returned an error code (1)

 

/var/log/spark/spark-worker.out contains:

Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/impl/StaticLoggerBinder
    at org.apache.spark.Logging$class.initializeLogging(Logging.scala:114)
    at org.apache.spark.Logging$class.initializeIfNecessary(Logging.scala:106)
    at org.apache.spark.Logging$class.log(Logging.scala:45)
    at org.apache.spark.util.Utils$.log(Utils.scala:49)
    at org.apache.spark.Logging$class.logWarning(Logging.scala:70)
    at org.apache.spark.util.Utils$.logWarning(Utils.scala:49)
    at org.apache.spark.util.Utils$$anonfun$findLocalIpAddress$1$$anonfun$apply$2.apply(Utils.scala:476)
    at org.apache.spark.util.Utils$$anonfun$findLocalIpAddress$1$$anonfun$apply$2.apply(Utils.scala:473)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
    at org.apache.spark.util.Utils$$anonfun$findLocalIpAddress$1.apply(Utils.scala:473)
    at org.apache.spark.util.Utils$$anonfun$findLocalIpAddress$1.apply(Utils.scala:472)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
    at org.apache.spark.util.Utils$.findLocalIpAddress(Utils.scala:472)
    at org.apache.spark.util.Utils$.localIpAddress$lzycompute(Utils.scala:460)
    at org.apache.spark.util.Utils$.localIpAddress(Utils.scala:460)
    at org.apache.spark.util.Utils$.localIpAddressHostname$lzycompute(Utils.scala:461)
    at org.apache.spark.util.Utils$.localIpAddressHostname(Utils.scala:461)
    at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:508)
    at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:508)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.util.Utils$.localHostName(Utils.scala:508)
    at org.apache.spark.deploy.worker.WorkerArguments.<init>(WorkerArguments.scala:28)
    at org.apache.spark.deploy.worker.Worker$.main(Worker.scala:366)
    at org.apache.spark.deploy.worker.Worker.main(Worker.scala)
Caused by: java.lang.ClassNotFoundException: org.slf4j.impl.StaticLoggerBinder
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 26 more

I tried installing some slf4j and log4j libraries to make it work, but with no luck.

I also tried playing with the classpath, without success.

5 REPLIES

New Contributor

This error is reported when the org.slf4j.impl.StaticLoggerBinder class could not be loaded into memory. This happens when no appropriate SLF4J binding could be found on the class path. Placing one (and only one) of slf4j-nop.jar, slf4j-simple.jar, slf4j-log4j12.jar, slf4j-jdk14.jar or logback-classic.jar on the class path should solve the problem.
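For example, on Ubuntu the bindings ship in the libslf4j-java package, and one of the jars can be exposed to the Spark daemons through SPARK_CLASSPATH. This is only a sketch; the package name and the /etc/spark/conf/spark-env.sh location are assumptions based on a typical package-based CDH install, and /usr/share/java/slf4j-simple.jar matches the path other posters use below:

# Install the Debian SLF4J package (assumed to provide /usr/share/java/slf4j-simple.jar)
sudo apt-get install libslf4j-java

# Put exactly one binding on the daemons' classpath (spark-env.sh path assumed for a package install)
echo 'export SPARK_CLASSPATH=$SPARK_CLASSPATH:/usr/share/java/slf4j-simple.jar' | sudo tee -a /etc/spark/conf/spark-env.sh

# Re-run the failed post-installation scripts so apt finishes configuring the packages
sudo dpkg --configure -a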

Explorer

I have a similar problem trying to run spark-shell from CDH 5.4 under Ubuntu 14.04. I was able to get around it by running:

 

SPARK_PRINT_LAUNCH_COMMAND=true spark-shell

 

to get the underlying java invocation, and then tacking an SLF4J jar onto the classpath there:

 

/usr/lib/jvm/java-7-oracle/bin/java -cp :/usr/lib/spark/conf:/usr/lib/spark/lib/spark-assembly-1.3.0-cdh5.4.0-hadoop2.6.0-cdh5.4.0.jar:/etc/hadoop/conf::/usr/lib/spark/lib/spark-assembly.jar::/usr/lib/hadoop/lib/*:/usr/lib/hadoop/*:/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/*:/usr/lib/hadoop-mapreduce/lib/*:/usr/lib/hadoop-mapreduce/*:/usr/lib/hadoop-yarn/lib/*:/usr/lib/hadoop-yarn/*:/usr/lib/hive/lib/*:/usr/lib/flume-ng/lib/*:/usr/lib/paquet/lib/*:/usr/lib/avro/lib/*:/usr/share/java/slf4j-simple.jar -XX:MaxPermSize=128m -Dscala.usejavacp=true -Xms512m -Xmx512m org.apache.spark.deploy.SparkSubmit --class org.apache.spark.repl.Main spark-shell

 

There must be something missing from one of the scripts or JARs in the CDH Ubuntu repo.  Any suggestions on how to hack it so that spark-shell works until this is addressed?

 

Explorer

Two solutions that I worked out by staring at the various scripts in /usr/lib/spark/bin:

 

SPARK_CLASSPATH=/usr/share/java/slf4j-simple.jar spark-shell --master local

 

spark-shell --driver-class-path /usr/share/java/slf4j-simple.jar --master local

 

The first approach is deprecated according to the output.
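If you'd rather not pass the flag on every invocation, the same jar can be wired in persistently through spark-defaults.conf. A sketch, assuming the package-install config path /etc/spark/conf and that /usr/share/java/slf4j-simple.jar is the binding you want:

# spark.driver.extraClassPath is the config equivalent of --driver-class-path
echo 'spark.driver.extraClassPath /usr/share/java/slf4j-simple.jar' | sudo tee -a /etc/spark/conf/spark-defaults.conf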

Master Collaborator

The major problem in this thread is that you're trying to install Spark manually from packages. Done that way, it takes a lot more work to set up the rest of the environment variables and configuration, and it doesn't look like that has been done here. CDH already sets up Spark for you, and installing the standalone Debian packages on top of it only makes the situation more complex, since those packages expect their own setup rather than CDH's. Don't do this; just use CDH's Spark. You're welcome to do what you like, but a manual install isn't supported, and this isn't really the place to ask about it, since you're not using CDH's Spark at all.
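If you want to back out the manual package install before switching to the Spark that CDH manages for you, something along these lines should clean it up (a sketch; adjust the package list to whatever was actually installed):

# Stop the standalone daemons if they ever started (they may not have, given the postinst failures)
sudo service spark-master stop || true
sudo service spark-worker stop || true

# Remove the standalone Spark packages from the original post
sudo apt-get remove --purge spark-core spark-master spark-worker spark-history-server spark-python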

New Contributor

But will installing CDH 5.5 from the tarball include Spark and the other Hadoop components too? I installed CDH 5.5 from the tarball, without Cloudera Manager, but I cannot see any Spark jars or jars for any other component. Please suggest how I can make use of the built-in Hadoop components in CDH.