
Connection refused error while starting namenode from Ambari UI


I am getting an error while starting the NameNode, where it says "-safemode get | grep 'Safe mode is OFF'"

Ambari: 2.7.0.0

HDP: 3.0

I got the error below, so I ran:

hdfs dfsadmin -safemode leave

Output:
safemode: Call From devaz01/10.161.137.4 to devaz01:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused

Then I ran:

chmod -R 644 /var/log/hadoop/hdfs

Since then I am getting:

Execution of 'ambari-sudo.sh su hdfs -l -s /bin/bash -c 'ulimit -c unlimited ;  /usr/hdp/3.0.1.0-187/hadoop/bin/hdfs --config /usr/hdp/3.0.1.0-187/hadoop/conf --daemon start namenode'' returned 1. ERROR: Unable to write in /var/log/hadoop/hdfs. Aborting.

2019-02-08 03:31:21,394 - Retrying after 10 seconds. Reason: Execution of '/usr/hdp/current/hadoop-hdfs-namenode/bin/hdfs dfsadmin -fs hdfs://machine:8020 -safemode get | grep 'Safe mode is OFF'' returned 1. safemode: Call From machine/xx.xxx.xxx.x to machine:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
safemode: Call From machine/10.161.137.4 to machine:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused 
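The "Unable to write" failure above can be reproduced in a throwaway directory: `chmod -R 644` strips the execute (x) bit that a directory needs before files inside it can be created or listed. A minimal sketch on a scratch path (not the real /var/log/hadoop/hdfs):

```shell
# Reproduce the "Unable to write" failure in a throwaway directory.
tmp=$(mktemp -d)
mkdir "$tmp/logs"
chmod 644 "$tmp/logs"     # the same mistake: no execute bit on a directory
# Writing into a directory without the x bit fails for non-root users.
touch "$tmp/logs/app.log" 2>/dev/null || echo "write refused"
chmod 755 "$tmp/logs"     # restore a sane directory mode
touch "$tmp/logs/app.log" && echo "write succeeded after fix"
rm -rf "$tmp"
```

This is why the chmod made things worse: file contents were never the problem, but the directory itself became untraversable for the hdfs user.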
1 ACCEPTED SOLUTION

Super Mentor

@Shraddha Singh

As we can see, the error is:

ERROR: Unable to write in /var/log/hadoop/hdfs. Aborting.

So please make sure that the directory permissions are correctly set, as follows:

# ls -ld /var/log/hadoop/hdfs
drwxr-xr-x. 2 hdfs hadoop 8192 Feb 11 03:27 /var/log/hadoop/hdfs

If you see that the permissions are incorrectly set, then please do the following and try starting the process again. (You executed chmod; it is better to change the ownership of the mentioned directory, as follows:)

# chown -R hdfs:hadoop /var/log/hadoop


Then try your commands:

# su - hdfs
# hdfs dfsadmin -safemode leave
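One caveat worth noting: `chown -R` fixes ownership but does not undo the mode bits left by the earlier `chmod -R 644`, so the directory itself may also need its execute bit restored (for example `chmod 755` on the directory). A minimal sketch on a scratch path, not the real /var/log/hadoop/hdfs:

```shell
# Illustration on a scratch path of repairing a log directory's mode.
logdir=$(mktemp -d)/hdfs
mkdir -p "$logdir"
chmod 644 "$logdir"      # the damaged state left by chmod -R 644
chmod 755 "$logdir"      # directories need rwx for the owner, r-x for others
# On the real cluster, `chown -R hdfs:hadoop /var/log/hadoop` would run here
# (requires root and the hdfs/hadoop accounts).
stat -c '%a' "$logdir"   # prints 755 with GNU stat
```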



5 REPLIES

Mentor

@Shraddha Singh

You are trying to run the dfsadmin command before the NameNode has started. Please ensure the NameNode is in the Started state in Ambari.
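A quick way to confirm this (assuming the NameNode RPC port is 8020, as the error message shows) is to check whether anything is listening on that port; if nothing is, dfsadmin can only get "Connection refused". A hedged sketch:

```shell
# Check for a listener on the NameNode RPC port (8020, taken from the error).
if ss -ltn 2>/dev/null | grep -q ':8020 '; then
  msg="listener found on 8020 - NameNode RPC is up"
else
  msg="no listener on 8020 - start the NameNode first"
fi
echo "$msg"
```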


Hi @Geoffrey Shelton Okot

I am getting the above error while starting the NameNode from the Ambari UI. When I click 'Start All', the services do not start because the NameNode fails to start due to the above error.



Thanks @Jay Kumar SenSharma

That worked for me!

Mentor

@Shraddha Singh

Any updates? Did my response resolve the issue? If so, please accept the answer so the thread can be marked as closed.
