Support Questions


:8020 failed on connection exception: java.net.ConnectException: Connection refused

New Contributor

I have installed Cloudera 5.9 on a single-node cluster (OS: RHEL 7.2). The cluster was functional and working fine.

Suddenly, while running a Spark job, it went into bad health.

Now any hadoop command I run gives the error below.

Ex: sudo -u hdfs hadoop fs -ls /tmp

:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused

 

I tried changing the port from 8020 to 9000, but it still returns the same error.

The jps command shows:

sudo jps
15649 Main
19394 Jps
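
On a healthy single node, jps would normally also list NameNode, DataNode and SecondaryNameNode, so it looks like the HDFS daemons are not running. A quick check of the NameNode port (a rough sketch, assuming the default 8020):

sudo jps                    # a working node should also show NameNode, DataNode, SecondaryNameNode
sudo ss -tlnp | grep 8020   # empty output means no process is listening on the NameNode RPC port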

 

The Cloudera Manager UI is running, but all the components are in Unknown health.

The following messages are shown:

Request to the Service Monitor failed. This may cause slow page responses. View the status of the Service Monitor.

Request to the Host Monitor failed. This may cause slow page responses. View the status of the Host Monitor.

 

Please help.

 

Thanks

Devendra

4 REPLIES

Rising Star

Hi, were you able to fix the issue?

 

We have the same problem.


Hello guys, I got the same issue while running Apache Hadoop on my machine. When I tried to access HDFS, I got the following error:

 

ls: Call From chinni/172.17.0.1 to chinni:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
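
For reference, the chinni:9000 in that message comes from fs.defaultFS in core-site.xml, so a quick way to see what address the client is pointed at and whether anything is actually listening there (a rough sketch, assuming stock Hadoop commands on the same machine):

hdfs getconf -confKey fs.defaultFS   # should print something like hdfs://chinni:9000
sudo ss -tlnp | grep 9000            # empty output means the NameNode is not listening on that port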

 

Can anybody tell me the solution?

Thank you!

New Contributor

A common cause of this is that the Hadoop services aren't running.

 

To start the Hadoop services in the Cloudera QuickStart VM, you can use the commands below:

sudo service hadoop-hdfs-datanode start
sudo service hadoop-hdfs-journalnode start
sudo service hadoop-hdfs-namenode start
sudo service hadoop-hdfs-secondarynamenode start
sudo service hadoop-httpfs start

 

sudo service hadoop-yarn-nodemanager start
sudo service hadoop-yarn-resourcemanager start
sudo service hadoop-mapreduce-historyserver start

 

Then try your hadoop commands again.

Hope it will work.

For more details, visit https://wiki.apache.org/hadoop/ConnectionRefused
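
If the services start cleanly, something like the following should confirm the NameNode is actually up before retrying (a rough sketch; 8020 is the usual CDH NameNode RPC port, adjust if your fs.defaultFS says otherwise):

sudo jps                         # should now list NameNode, DataNode, SecondaryNameNode, ResourceManager, NodeManager
sudo netstat -tlnp | grep 8020   # the NameNode process should be listening on its RPC port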

New Contributor

Check IP address mapping in /etc/hosts

 

If you are using a floating IP to map between public and private addresses, update /etc/hosts with only the private IP address. The IP in /etc/hosts should be the one shown by the ifconfig command.
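
For example, if ifconfig reports a private address of 10.0.0.5 for a host named myhost (both values hypothetical), /etc/hosts would look something like this, with no entry pointing the hostname at the floating/public IP:

127.0.0.1   localhost
10.0.0.5    myhost.example.com myhost   # private IP from ifconfig, not the floating/public IP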