Support Questions


Hive Error: SLF4J: Class path contains multiple SLF4J bindings

Contributor

Hi All,

I am trying to execute an HQL script with the following properties set:

 SET hive.execution.engine=tez;
 SET hive.exec.compress.output=true;
 SET mapred.output.compression.codec=org.apache.hadoop.io.compress.SnappyCodec;
 SET mapred.output.compression.type=BLOCK;
 SET hive.vectorized.execution.enabled = true;
 SET hive.vectorized.execution.reduce.enabled = true;
 SET hive.vectorized.execution.reduce.groupby.enabled = true;
 SET mapred.job.queue.name=mtl;
 SET hive.cbo.enable=true;
 SET hive.compute.query.using.stats=true;
 SET hive.stats.fetch.column.stats=true;
 SET hive.stats.fetch.partition.stats=true;
 SET tez.am.container.reuse.enabled=false;
 SET hive.exec.dynamic.partition = true;
 SET hive.exec.dynamic.partition.mode = nonstrict;
 SET hive.exec.reducers.bytes.per.reducer=524288000;
 SET tez.queue.name= pete_spark;
 SET hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DummyTxnManager;

but I am getting the error message below:

Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/hive/lib/spark-assembly-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/hive/lib/spark-examples-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/hive/lib/spark-hdp-assembly.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
WARNING: Use "yarn jar" to launch YARN applications.
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/hive/lib/spark-assembly-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/hive/lib/spark-examples-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/hive/lib/spark-hdp-assembly.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Logging initialized using configuration in file:/etc/hive/2.4.2.0-258/0/hive-log4j.properties
OK

Is there a resolution for this issue? Your help or suggestions are highly appreciated.
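Edit: for anyone hitting the same warning, here is a quick sketch to list which jars bundle an SLF4J binding. The directory paths are the ones from the log above; the helper name is my own. The grep trick works because zip/jar entry names are stored uncompressed inside the archive.

```shell
# List jars that bundle an SLF4J binding (StaticLoggerBinder.class).
# Jar entry names are stored uncompressed, so a plain grep finds them.
find_bindings() {
    for jar in "$@"; do
        if grep -q 'org/slf4j/impl/StaticLoggerBinder' "$jar" 2>/dev/null; then
            echo "$jar"
        fi
    done
}

# Directories taken from the SLF4J output above
find_bindings /usr/hdp/2.4.2.0-258/hadoop/lib/*.jar \
              /usr/hdp/2.4.2.0-258/hive/lib/*.jar
```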

3 REPLIES

Super Guru
@Vijay Parmar

Here is your issue:

  1. SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
  2. SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/hive/lib/spark-assembly-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/StaticLoggerBinder.class]
  3. SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/hive/lib/spark-examples-1.6.1.2.4.2.0-258-hadoop2.7.1.2.4.2.0-258.jar!/org/slf4j/impl/StaticLoggerBinder.class]
  4. SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/hive/lib/spark-hdp-assembly.jar!/org/slf4j/impl/StaticLoggerBinder.class]

You need to get rid of the 2nd, 3rd, and 4th. For now, just move them to a backup location that is not on the CLASSPATH, then run the job. After that, figure out whether anything else is using these jar files (definitely not number 3, since it is just the examples jar). Judging by the names, you will probably never need them: they are only required for Hive on Spark, which I am guessing you are not using, since you are on HDP, which runs Hive on Tez and LLAP.
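A minimal sketch of the move. The jar names and lib directory come from the log; the backup path /tmp/hive-lib-backup and the helper name are placeholders of my own, so pick a location that suits your node.

```shell
# Move the extra SLF4J-binding jars out of hive/lib into a directory
# that is NOT on the classpath; the hadoop/lib binding stays behind.
backup_jars() {
    lib_dir=$1
    backup_dir=$2
    mkdir -p "$backup_dir"
    for jar in "$lib_dir"/spark-assembly-*.jar \
               "$lib_dir"/spark-examples-*.jar \
               "$lib_dir"/spark-hdp-assembly.jar; do
        if [ -f "$jar" ]; then   # skip glob patterns that matched nothing
            mv "$jar" "$backup_dir/"
        fi
    done
}

# Run on the node from the log, with write access to /usr/hdp:
backup_jars /usr/hdp/2.4.2.0-258/hive/lib /tmp/hive-lib-backup
```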

Contributor

@mqureshi If they are moved, then other applications may be affected. Is there any other way to resolve this without affecting other applications?

Super Guru

@Vijay Parmar

Are you using Hive on Spark? These libraries are under Hive, so if you are not using Hive on Spark, your other applications should not be affected. Regardless, I am not asking you to delete them. Just move them to resolve this issue; you can restore them in the unlikely event that anything else is impacted.
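Restoring is just the reverse move. A sketch, where both paths are placeholders (use the backup directory you actually moved the jars to):

```shell
# Put previously moved jars back into hive/lib if something
# does turn out to depend on them (paths are placeholders).
restore_jars() {
    backup_dir=$1
    lib_dir=$2
    for jar in "$backup_dir"/*.jar; do
        if [ -f "$jar" ]; then   # skip the pattern if nothing matched
            mv "$jar" "$lib_dir/"
        fi
    done
}

restore_jars /tmp/hive-lib-backup /usr/hdp/2.4.2.0-258/hive/lib
```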