New Contributor
Posts: 2
Registered: ‎06-08-2015

Not able to access HDFS, getting Connection exception.

I started the Cloudera VM normally, but when I list the files in HDFS I get a connection exception:

 

[cloudera@quickstart ~]$ hadoop fs -ls /user/
ls: Call From quickstart.cloudera/127.0.0.1 to quickstart.cloudera:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

 

I guess this is because the Hadoop services are not running, but can somebody please explain why I am getting this error when I start the Cloudera VM as required?

 

Cloudera Employee
Posts: 435
Registered: ‎07-12-2013

Re: Not able to access HDFS, getting Connection exception.

All the basic Hadoop services should be running when you start the VM. Port 8020 is for the hadoop-hdfs-namenode service, so my guess is that service has failed and just needs to be restarted.
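As a quick sketch (assuming netstat is installed on the VM), you can confirm whether anything is actually listening on port 8020:

sudo netstat -tlnp | grep ':8020'   # should show a java process if the NameNode is up

If nothing is listed, the NameNode is not serving requests and restarting it (or checking its logs) is the next step.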

 

You can check the status of a service with

service <service-name> status

and you can restart a service with

service <service-name> restart

So 'service hadoop-hdfs-namenode restart' may be all you need. Also check the hadoop-hdfs-datanode service as it may also need to be restarted.
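For example, on the QuickStart VM (assuming the default package service names, and running as root or with sudo) the sequence would look roughly like this:

sudo service hadoop-hdfs-namenode restart   # restart the NameNode, which serves port 8020
sudo service hadoop-hdfs-datanode status    # confirm the DataNode is still up
sudo service hadoop-hdfs-datanode restart   # restart it as well if it is not running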

 

The services should have been running, so if they're not it means something went wrong. If you're curious or if you continue to have a problem, have a look at the NameNode logs in /var/log/hadoop-hdfs for anything that looks like a fatal error and post it back here.
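A minimal way to scan those logs for fatal errors, assuming the default QuickStart log file names (the exact file name may differ on your VM):

sudo tail -n 100 /var/log/hadoop-hdfs/hadoop-hdfs-namenode-quickstart.cloudera.log
sudo grep -iE 'FATAL|ERROR' /var/log/hadoop-hdfs/hadoop-hdfs-namenode-*.log | tail -n 20   # last few error lines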

 

New Contributor
Posts: 2
Registered: ‎06-08-2015

Re: Not able to access HDFS, getting Connection exception.

My datanode service is running fine, but the namenode service is indeed not running.

I tried restarting it, but the restart fails:

 

[root@quickstart cloudera]# service hadoop-hdfs-datanode status
Hadoop datanode is running                                 [  OK  ]
[root@quickstart cloudera]# service hadoop-hdfs-namenode restart
no namenode to stop
Stopped Hadoop namenode:                                   [  OK  ]
starting namenode, logging to /var/log/hadoop-hdfs/hadoop-hdfs-namenode-quickstart.cloudera.out
Failed to start Hadoop namenode. Return value: 1           [FAILED]

 

Please advise.

Explorer
Posts: 25
Registered: ‎12-10-2014

Re: Not able to access HDFS, getting Connection exception.

I have the same problem; tailing the DataNode log gives this output:

 

2015-08-06 07:47:26,459 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: 0.0.0.0/0.0.0.0:8022
2015-08-06 07:47:32,462 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8022. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-08-06 07:47:33,463 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8022. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-08-06 07:47:34,464 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8022. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-08-06 07:47:35,465 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8022. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-08-06 07:47:36,466 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8022. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-08-06 07:47:37,467 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8022. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-08-06 07:47:38,468 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8022. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-08-06 07:47:39,469 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8022. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-08-06 07:47:40,471 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8022. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-08-06 07:47:41,472 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8022. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-08-06 07:47:41,473 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: 0.0.0.0/0.0.0.0:8022

Before I noticed this problem I updated the CentOS image with sudo yum update, about 3 GB of new packages...
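One quick self-check, as a sketch assuming the default QuickStart layout and that netstat is installed, is to see whether anything is listening on port 8022 and how the service RPC address is configured:

sudo netstat -tlnp | grep ':8022'                     # is any process listening on 8022?
grep -A1 'servicerpc' /etc/hadoop/conf/hdfs-site.xml  # dfs.namenode.servicerpc-address, if it is set

If nothing is listening, the problem is on the NameNode side rather than the DataNode.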

 

Is there any way to see what's going on with a graphical user interface?

 

 

New Contributor
Posts: 2
Registered: ‎02-14-2016

Re: Not able to access HDFS, getting Connection exception.

You can check the status of a service with

sudo service <service-name> status

and you can restart a service with

sudo service <service-name> restart

If you run the above commands without sudo, you might get an error message like "Error: Root User required."
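As a small sketch, assuming the standard QuickStart service names, you can check all of the HDFS daemons in one pass:

for svc in hadoop-hdfs-namenode hadoop-hdfs-secondarynamenode hadoop-hdfs-datanode; do
  sudo service "$svc" status    # print the status of each HDFS daemon
done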

New Contributor
Posts: 1
Registered: ‎09-23-2016

Re: Not able to access HDFS, getting Connection exception.

I am facing the same issue. Please help me if you have found a solution to this.

Explorer
Posts: 30
Registered: ‎02-01-2017

Re: Not able to access HDFS, getting Connection exception.

Did you resolve the issue? I am facing the same problem when trying to execute a command, even after starting the service and having the status report OK.

New Contributor
Posts: 5
Registered: ‎02-08-2017

Re: Not able to access HDFS, getting Connection exception.

First, check the status of the service using this command:

sudo service hadoop-hdfs-<service_name> status

e.g. sudo service hadoop-hdfs-namenode status

If the status is stopped, try to start it using the command below:

sudo service hadoop-hdfs-<service_name> start

If it's running, first stop it and then restart it:

sudo service hadoop-hdfs-<service_name> stop

sudo service hadoop-hdfs-<service_name> restart
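Put together for the NameNode (the service behind port 8020 in the original error), and assuming the default QuickStart service name, that would be roughly:

sudo service hadoop-hdfs-namenode status
sudo service hadoop-hdfs-namenode stop      # only needed if it reports as running
sudo service hadoop-hdfs-namenode start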

 

Hope it will work for you.

Explorer
Posts: 30
Registered: ‎02-01-2017

Re: Not able to access HDFS, getting Connection exception.

Thanks for the response; unfortunately this did not work for me. This is what I tried:
First I checked the status; it was not running, so I started the service with
sudo service hadoop-hdfs-datanode start
Then I tried hadoop fs -ls /
This gave me the same error as before. Do I need to also start a namenode or something? I'm thinking I shouldn't, because I am not in control of namenodes, and on my coworkers' computers it just works. Any suggestions are appreciated.