New Contributor

java.lang.NoClassDefFoundError: org/slf4j/impl/StaticLoggerBinder when launching spark2

I installed CDH 5.11.0, along with Spark 2 (according to this doc: https://www.cloudera.com/documentation/spark2/latest/topics/spark2_installing.html). However, when I tried to launch a Spark job using spark2-submit, the error below occurred:

 

Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/impl/StaticLoggerBinder
	at org.apache.spark.internal.Logging$class.initializeLogging(Logging.scala:111)
	at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:102)
	at org.apache.spark.deploy.yarn.ApplicationMaster$.initializeLogIfNecessary(ApplicationMaster.scala:746)
	at org.apache.spark.internal.Logging$class.log(Logging.scala:46)
	at org.apache.spark.deploy.yarn.ApplicationMaster$.log(ApplicationMaster.scala:746)
	at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:761)
	at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:795)
	at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
Caused by: java.lang.ClassNotFoundException: org.slf4j.impl.StaticLoggerBinder
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 8 more

 

I realize something is wrong with the classpath, but can I fix it through Cloudera Manager, or do I have to set the classpath manually on each node?

Cloudera Employee

Re: java.lang.NoClassDefFoundError: org/slf4j/impl/StaticLoggerBinder when launching spark2

 

Caused by: java.lang.ClassNotFoundException: org.slf4j.impl.StaticLoggerBinder

  ^ This is generally an indication of a missing or incorrect Hadoop/Spark2 client configuration (the SLF4J binding, slf4j-log4j12, reaches the classpath through the classpath.txt that is generated as part of that client configuration).

 

  I'd make sure that the Spark2 gateway role is added to the node from where you're running spark2-submit.

 

  https://www.cloudera.com/documentation/spark2/latest/topics/spark2_installing.html

  "When configuring the assignment of role instances to hosts, add a gateway role to every host"

 

  If you've already done that, please share the output of: 

  # alternatives --display spark2-conf

  

Example from a working node
# alternatives --display spark2-conf
spark2-conf - status is auto.
 link currently points to /etc/spark2/conf.cloudera.spark2_on_yarn
/opt/cloudera/parcels/SPARK2-2.2.0.cloudera1-1.cdh5.12.0.p0.142354/etc/spark2/conf.dist - priority 10
/etc/spark2/conf.cloudera.spark2_on_yarn - priority 51
Current `best' version is /etc/spark2/conf.cloudera.spark2_on_yarn.

# grep slf4j-log4j /etc/spark2/conf.cloudera.spark2_on_yarn/classpath.txt
/opt/cloudera/parcels/CDH-5.11.0-1.cdh5.11.0.p0.34/jars/slf4j-log4j12-1.7.5.jar

# ls -l /opt/cloudera/parcels/CDH-5.11.0-1.cdh5.11.0.p0.34/jars/slf4j-log4j12-1.7.5.jar
-rw-r--r-- 1 root root 8869 Apr 5 21:44 /opt/cloudera/parcels/CDH-5.11.0-1.cdh5.11.0.p0.34/jars/slf4j-log4j12-1.7.5.jar
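
As an extra check (the jar path below is the one from the listing above; adjust it to your parcel version), you can confirm that this jar actually contains the class named in the stack trace:

# unzip -l /opt/cloudera/parcels/CDH-5.11.0-1.cdh5.11.0.p0.34/jars/slf4j-log4j12-1.7.5.jar | grep StaticLoggerBinder

You should see an entry for org/slf4j/impl/StaticLoggerBinder.class. If the jar is present and listed in classpath.txt, the remaining suspect is the client configuration on the node where the submit fails.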

 

New Contributor

Re: java.lang.NoClassDefFoundError: org/slf4j/impl/StaticLoggerBinder when launching spark2

I re-added the entire Spark2 service and deployed it to all hosts. The output of 'alternatives --display spark2-conf' is as below:

spark2-conf - status is auto.
 link currently points to /etc/spark2/conf.cloudera.spark2_on_yarn
/opt/cloudera/parcels/SPARK2-2.2.0.cloudera1-1.cdh5.12.0.p0.142354/etc/spark2/conf.dist - priority 10
/etc/spark2/conf.cloudera.spark2_on_yarn - priority 51
Current `best' version is /etc/spark2/conf.cloudera.spark2_on_yarn.

I had only tried to submit a job from one node before asking the question, so this time I submitted the job from several other nodes. It seems all the other nodes are fine except for the node I used before. The node that fails is the one where the Spark2 History Server is running, and it is not a ZooKeeper server node. Is this normal? My understanding is that every node should be able to submit jobs.
Cloudera Employee

Re: java.lang.NoClassDefFoundError: org/slf4j/impl/StaticLoggerBinder when launching spark2

Having an SHS (Spark2 History Server) role, or not having a ZK (ZooKeeper) role, shouldn't affect the Spark job. All that's required is a Spark2 gateway role on the node from which you run the spark2 job. Given that the other nodes can launch the same job, odds are high that the problem is with the client configuration or classpath on this particular node. BTW, the spark2-conf output looks fine.
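
One way to narrow this down (a sketch only; good-node and bad-node below are placeholder hostnames for a working host and the failing host) is to compare the active Spark2 client configuration between the two:

$ ssh good-node 'alternatives --display spark2-conf' > good.txt   # good-node = a host where spark2-submit works
$ ssh bad-node  'alternatives --display spark2-conf' > bad.txt    # bad-node  = the failing host
$ diff good.txt bad.txt
$ ssh good-node 'md5sum /etc/spark2/conf.cloudera.spark2_on_yarn/classpath.txt'
$ ssh bad-node  'md5sum /etc/spark2/conf.cloudera.spark2_on_yarn/classpath.txt'
$ ssh bad-node  'ls -l /opt/cloudera/parcels/CDH-5.11.0-1.cdh5.11.0.p0.34/jars/slf4j-log4j12-1.7.5.jar'

Any difference in the alternatives output, a mismatched classpath.txt, or a missing parcel jar on the failing host would explain why only that node hits the NoClassDefFoundError.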

 

Can you please confirm whether you are able to run a simple Spark Pi job, or does that fail too with the same message?

 

$ spark2-submit --deploy-mode client --class org.apache.spark.examples.SparkPi /opt/cloudera/parcels/SPARK2/lib/spark2/examples/jars/spark-examples*.jar 10 10

 

17/07/10 04:14:53 INFO spark.SparkContext: Running Spark version 2.1.0.cloudera1

....

Pi is roughly 3.1397831397831397
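
If that client-mode run succeeds, you could also try the same example in cluster mode (a sketch; the same example jar path is assumed), since that exercises the YARN ApplicationMaster code path that appears in your original stack trace:

$ spark2-submit --master yarn --deploy-mode cluster --class org.apache.spark.examples.SparkPi /opt/cloudera/parcels/SPARK2/lib/spark2/examples/jars/spark-examples*.jar 10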
