Created 06-20-2021 04:26 AM
I installed Hadoop 2.7.1 on a CentOS 7 operating system.
When I execute the command # hdfs dfs -mkdir /user/Juste, I get an error:
[hdfs@MASTER ~]$ hdfs dfs -mkdir /user/Juste
21/06/20 11:04:04 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
mkdir: `/user/Juste': No such file or directory
Created 06-20-2021 06:41 AM
Hi,
Hadoop has native implementations of certain components, both for performance and because Java implementations are not available for them. These components are provided in a single, dynamically-linked native Linux library called the native Hadoop library. On *nix platforms the library is named libhadoop.so.
That message is just a warning, not an error, and it can be ignored.
Please check whether other HDFS commands fail as well, and check the HDFS logs to see if you can find the exact errors.
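For example, you could run a couple of basic commands and then look at the NameNode log. The log path below assumes a typical tarball install that writes logs under $HADOOP_HOME/logs; adjust it to your layout:
## hdfs dfs -ls /
## hdfs dfsadmin -report
## tail -n 100 $HADOOP_HOME/logs/hadoop-*-namenode-*.log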
Created 06-20-2021 07:04 AM
When I execute this command, this is the output:
[hdfs@MASTER ~]$ hdfs dfs -mkdir /user
21/06/20 13:51:54 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
mkdir: Cannot create directory /user. Name node is in safe mode
Created 06-20-2021 07:12 AM
Safe mode in Hadoop is a maintenance state of the NameNode, during which the NameNode doesn't allow any modifications to the file system.
Can you use the commands below to come out of safe mode and then try to create the directory?
## hadoop dfsadmin -safemode leave
## hdfs dfs -mkdir /user/justee
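If you want to confirm the current state first, you can query it with the standard dfsadmin option before leaving safe mode:
## hdfs dfsadmin -safemode get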
Created on 06-20-2021 10:48 AM - edited 06-20-2021 11:46 AM
Hi @ChethanYM, when I created the hdfs user I didn't give it a password, but when I execute the command this is what I'm asked for. I tried the root password, but it doesn't work either.
[hdfs@MASTER ~]$ hdfs dfsadmin -safemode leave
21/06/20 17:30:15 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
safemode: Access denied for user hdfs. Superuser privilege is required
[hdfs@MASTER ~]$ sudo hdfs dfsadmin -safemode leave
[sudo] password for hdfs:
sorry, try again.
[sudo] password for hdfs:
sorry, try again.
[sudo] password for hdfs:
Created 06-20-2021 07:59 PM
Hi,
Can you try the command below and see?
## sudo -u hdfs hdfs dfsadmin -safemode leave
If this doesn't work, provide the output of the command below from the terminal:
## id hdfs
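For context, the HDFS superuser is simply the OS account that started the NameNode process, so the "Superuser privilege is required" error means the hdfs user you are running as is not that account. If you are unsure which user is running the NameNode, you can check the process (a generic Linux check, not specific to your install):
## ps -ef | grep -i namenode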
Created 06-21-2021 12:26 AM
Hi @ChethanYM, I tried the following command and got another error:
[root@MASTER ~]# su - hdfs
Last login: Sun Jun 20 17:27:39 UTC 2021 on pts/5
[hdfs@MASTER ~]$ sudo -u hdfs hdfs dfsadmin -safemode leave
sudo: hdfs: command not found
[hdfs@MASTER ~]$ id hdfs
uid=30008(hdfs) gid=30009(hdfs) groups=30009(hdfs)
[hdfs@MASTER ~]$
Created 06-21-2021 12:58 AM
You don't need to switch to the hdfs user to run the command below.
Try running the command as the root user (or with sudo):
sudo -u hdfs hdfs dfsadmin -safemode leave
Created 06-21-2021 01:59 AM
Hi @RohitPathak, I tried as the root user but I still get an error:
[root@MASTER ~]# sudo -u hdfs hdfs dfsadmin -safemode leave
sudo: hdfs: command not found
[root@MASTER ~]#
Created 06-21-2021 06:31 AM
It seems hdfs is not installed or configured properly. Can you check its path by using the "which hdfs" command?
If you are not able to see a path, check the environment variables in the "~/.bash_profile" file. Set the PATH something like the below and try again:
## PATH=$PATH:$HADOOP_HOME/bin
then run -> source ~/.bash_profile
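As a rough sketch, the relevant lines in ~/.bash_profile could look like this (the install directory /opt/hadoop-2.7.1 is only an assumed example; use the directory where you actually unpacked Hadoop):
export HADOOP_HOME=/opt/hadoop-2.7.1
export PATH=$PATH:$HADOOP_HOME/bin
Note that sudo may reset the PATH for the target user, so also confirm where the hdfs binary actually lives with "which hdfs".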
Below is the output from my test cluster:
[root@node2 bin]# which hdfs
/usr/bin/hdfs
[root@node2 bin]# sudo -u hdfs hdfs dfsadmin -safemode leave
Safe mode is OFF
[root@node2 bin]# id hdfs
uid=993(hdfs) gid=990(hdfs) groups=990(hdfs)