Created 02-08-2019 02:58 PM
I am getting an error while installing the NameNode, where it says "-safemode get | grep 'Safe mode is OFF''"
Ambari:2.7.0.0
HDP:3.0
I got the below error, and I ran:
> hdfs dfsadmin -safemode leave
Output:
safemode: Call From devaz01/10.161.137.4 to devaz01:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Then I ran:
> chmod -R 644 /var/log/hadoop/hdfs
Since then I am getting:
Execution of 'ambari-sudo.sh su hdfs -l -s /bin/bash -c 'ulimit -c unlimited ; /usr/hdp/3.0.1.0-187/hadoop/bin/hdfs --config /usr/hdp/3.0.1.0-187/hadoop/conf --daemon start namenode'' returned 1. ERROR: Unable to write in /var/log/hadoop/hdfs. Aborting.
2019-02-08 03:31:21,394 - Retrying after 10 seconds. Reason: Execution of '/usr/hdp/current/hadoop-hdfs-namenode/bin/hdfs dfsadmin -fs hdfs://machine:8020 -safemode get | grep 'Safe mode is OFF'' returned 1.
safemode: Call From machine/xx.xxx.xxx.x to machine:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
safemode: Call From machine/10.161.137.4 to machine:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
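For context on why the `chmod -R 644` made things worse: mode 644 on a directory removes the execute (traversal) bit, so even the owning user can no longer create files inside it, which matches the "Unable to write in /var/log/hadoop/hdfs" error. A minimal illustration using a throwaway directory (not the real HDFS log path):

```shell
# Demonstrate that mode 644 on a directory drops the execute bits,
# while 755 restores normal traversal/write behavior.
d=$(mktemp -d)
chmod 644 "$d"
stat -c '%a %A' "$d"    # 644 drw-r--r-- : directory has lost its execute bits
chmod 755 "$d"
stat -c '%a %A' "$d"    # 755 drwxr-xr-x : traversal and file creation work again
rm -rf "$d"
```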
Created 02-11-2019 05:09 AM
As we can see, the error is:
ERROR: Unable to write in /var/log/hadoop/hdfs. Aborting.
So please make sure that the directory permissions are correctly set, as follows:
# ls -ld /var/log/hadoop/hdfs
drwxr-xr-x. 2 hdfs hadoop 8192 Feb 11 03:27 /var/log/hadoop/hdfs
If you see that the permissions are incorrectly set, then please do the following and try starting the process again. (You executed chmod; it is better to change the ownership of the mentioned directory, as follows:)
# chown -R hdfs:hadoop /var/log/hadoop
Then try your commands:
# su - hdfs
# hdfs dfsadmin -safemode leave
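One caveat: `chown` restores ownership but not the mode, so if the directory is still at 644 after the chown, the execute bits may also need restoring. A minimal sketch, rehearsed on a throwaway directory rather than the real `/var/log/hadoop` (the real fix must be run as root on the affected host):

```shell
# Reproduce the broken state from the question on a temp dir, then repair it.
logroot=$(mktemp -d)
mkdir -p "$logroot/hdfs"
chmod -R 644 "$logroot/hdfs"       # the state left behind by 'chmod -R 644'
chmod -R u+rwX,go+rX "$logroot"    # capital X adds execute only where it belongs (directories)
stat -c '%a' "$logroot/hdfs"       # back to 755, so the daemon user can write logs again
rm -rf "$logroot"
```

The symbolic `u+rwX,go+rX` form is safer than a blanket `chmod -R 755` because it restores traversal on directories without marking ordinary log files executable.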
Created 02-10-2019 08:54 PM
You are trying to run the dfsadmin command when the NameNode is not yet started. Please ensure the NameNode is in Started status in Ambari.
Created 02-11-2019 04:27 AM
I am getting the above error while starting the NameNode from the Ambari UI. When I click 'Start All', the services do not start because the NameNode fails to start due to the above error.
Created 02-12-2019 11:00 AM
Thanks @Jay Kumar SenSharma
That worked for me!
Created 02-24-2019 02:12 AM
Any updates? Did my response resolve the issue? If so, please accept the answer so the thread is marked as closed.