Support Questions
Find answers, ask questions, and share your expertise

hadoop SLF4J: Class path contains multiple SLF4J bindings. error

Solved

New Contributor

Hi folks, I have installed a single-node Hadoop cluster on my local machine. I have been using it for 1-2 months now and have executed a few MR jobs without any problem. But recently, while studying Pig, I found that I was unable to start the JobHistory server. To solve this, I added the lines below to mapred-site.xml:

<property>
   <name>mapreduce.jobhistory.address</name>
   <value>localhost:10020</value> <!-- hostname of the machine where the JobHistory service runs -->
</property>
<property>
   <name>mapreduce.jobhistory.webapp.address</name>
   <value>localhost:19888</value> 
</property>
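A quick, hedged way to sanity-check such an entry without starting anything is to parse the value back out of the file with standard shell tools. The helper name and the /tmp demo file below are my own, and the naive grep/sed parse assumes the <name> and <value> elements sit on consecutive lines, as in the snippet above:

```shell
# Hypothetical helper: pull a property value out of a Hadoop *-site.xml.
# Naive line-based parse; a real tool should use a proper XML parser.
get_hadoop_prop() {
  grep -A1 "<name>$2</name>" "$1" \
    | sed -n 's:.*<value>\(.*\)</value>.*:\1:p'
}

# Demo against a throwaway copy of the snippet above
cat > /tmp/mapred-site-demo.xml <<'EOF'
<configuration>
  <property>
    <name>mapreduce.jobhistory.address</name>
    <value>localhost:10020</value>
  </property>
</configuration>
EOF

get_hadoop_prop /tmp/mapred-site-demo.xml mapreduce.jobhistory.address
# → localhost:10020
```

Note that adding the properties alone is not enough: on a plain Apache install the JobHistory server must also be started (for example with "mr-jobhistory-daemon.sh start historyserver") before port 10020 answers.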

But after making the above changes, I'm getting the error below while executing basic HDFS commands:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/share/java/slf4j/slf4j-log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/share/java/slf4j/slf4j-jdk14.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
18/12/23 22:22:23 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
ls: Call From localhost/127.0.0.1 to localhost:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused


Even a basic hadoop fs command throws the above error. I have removed the above configuration from mapred-site.xml and tried formatting the NameNode, but the issue remains. Can someone please help? I have been stuck on this for a long time.
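For what it's worth, the SLF4J lines are only warnings; the real failure is the "Connection refused" on localhost:8020, the default NameNode RPC port. A minimal sketch for probing that port from plain bash (no Hadoop tooling needed; it uses bash's /dev/tcp pseudo-device):

```shell
# Return success if something is listening on host:port.
# Uses bash's built-in /dev/tcp pseudo-device, so no extra tools needed.
port_open() {
  (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null
}

if port_open localhost 8020; then
  echo "8020: open -- something is listening"
else
  echo "8020: refused -- NameNode is likely not running"
fi
```

If the probe fails, the problem is on the NameNode side rather than in the mapred-site.xml change.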

1 ACCEPTED SOLUTION

Re: hadoop SLF4J: Class path contains multiple SLF4J bindings. error

Super Mentor

@Atharva Vadapurkar

You are getting the error below, so please check the hostnames and ports in the "dfs.namenode.rpc-address" (Advanced hdfs-site) and "fs.defaultFS" (Advanced core-site) properties.

Call From localhost/127.0.0.1 to localhost:8020 failed on connection exception: java.net.ConnectException: Connection refused;   



Notice that port 8020 is not accessible on "localhost". Please check your HDFS configuration to find out whether your NameNode is actually listening on the "localhost:8020" address:

Ambari UI --> HDFS --> Configs --> Advanced -->  Advanced hdfs-site --> "dfs.namenode.rpc-address"


Please also verify your "/etc/hosts" file to confirm that the "localhost" entry is set up properly (see the note at https://docs.hortonworks.com/HDPDocuments/Ambari-2.6.2.0/bk_ambari-installation-ppc/content/edit_the...) and that the NameNode port 8020 is open and listening on "localhost":

# cat /etc/hosts
# netstat -tnlpa | grep 8020
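As a hedged illustration of the first check: on a typical Linux box, /etc/hosts should map the name "localhost" to the loopback address, which grep can assert directly:

```shell
# Succeeds (and prints the matching line) if /etc/hosts maps
# the name "localhost" to the loopback address 127.0.0.1.
grep -w 'localhost' /etc/hosts | grep '^127\.0\.0\.1'
```

If the Hadoop CLI is on the PATH, "hdfs getconf -confKey dfs.namenode.rpc-address" prints the effective RPC address, which should agree with what netstat shows.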


4 REPLIES



Re: hadoop SLF4J: Class path contains multiple SLF4J bindings. error

New Contributor

Thanks for the response, @Jay Kumar SenSharma. Port 8020 is not listening; I'll look into this.


Re: hadoop SLF4J: Class path contains multiple SLF4J bindings. error

Super Mentor

@Atharva Vadapurkar

If port 8020 is not open, the best place to look is the NameNode startup logs, to find out whether any errors are logged or any NameNode tuning is needed.

Please share the NameNode logs as well.
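A hedged sketch of that log inspection: the function below simply counts suspicious lines, and the demo log is synthetic. On a plain Apache install the real log usually lives under something like $HADOOP_HOME/logs/hadoop-<user>-namenode-<host>.log; that path is an assumption, so adjust it for your layout:

```shell
# Count log lines that usually explain a failed NameNode startup.
scan_nn_log() {
  grep -ciE 'ERROR|FATAL|Exception' "$1"
}

# Demo against a synthetic two-line log (not real Hadoop output)
cat > /tmp/nn-demo.log <<'EOF'
INFO  startup message
ERROR something failed during startup
EOF

scan_nn_log /tmp/nn-demo.log
# → 1
```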


Re: hadoop SLF4J: Class path contains multiple SLF4J bindings. error

New Contributor

@Jay Kumar SenSharma I was getting the "directory is in an inconsistent state" error. I made the necessary changes in hdfs-site.xml and everything is working fine now. Thanks for the help! :)
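For anyone who lands here with the same "inconsistent state" message: it usually means the NameNode's metadata directory is missing, unwritable, or unformatted. A sketch of the kind of hdfs-site.xml property involved (the path below is only an example, not the poster's actual value):

```xml
<property>
   <name>dfs.namenode.name.dir</name>
   <!-- must exist, be writable by the user running the NameNode, and be
        formatted once with "hdfs namenode -format" before first start -->
   <value>/home/hadoop/hadoopdata/hdfs/namenode</value>
</property>
```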
