HDFS Cannot Create the directory of the user

Explorer

I installed Hadoop 2.7.1 on a CentOS 7 operating system.
When I execute the command hdfs dfs -mkdir /user/Juste, I get an error:

 

[hdfs@MASTER ~]$ hdfs dfs -mkdir /user/Juste
21/06/20 11:04:04 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
mkdir: `/user/Juste': No such file or directory

14 Replies

Super Collaborator

Hi,

Hadoop has native implementations of certain components, both for performance and because Java implementations are unavailable for some of them. These components are packaged in a single, dynamically linked native library called the native Hadoop library. On *nix platforms the library is named libhadoop.so.

It is just a warning, not an error, and it can be safely ignored.

 

Please check whether other hdfs commands fail as well, and check the HDFS logs to see the exact errors.
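As a side note on the warning itself, Hadoop ships a diagnostic that shows which native libraries it can actually load. A minimal sketch (checknative is available in Hadoop 2.x):

```shell
# Show which native libraries this Hadoop install can load.
# The NativeCodeLoader WARN above only means libhadoop.so was not found,
# so Hadoop falls back to its built-in Java implementations.
hadoop checknative -a
```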

 

 

Explorer

When I execute this command, this is the output:

[hdfs@MASTER ~]$ hdfs dfs -mkdir /user
21/06/20 13:51:54 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
mkdir: Cannot create directory /user. Name node is in safe mode

Super Collaborator

Safe mode in Hadoop is a maintenance state of the NameNode, during which the NameNode does not allow any modifications to the file system.

Can you use the commands below to come out of safe mode, then try to create the directory again?

 

## hdfs dfsadmin -safemode leave

## hdfs dfs -mkdir /user/Juste
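Before forcing safe mode off, it can help to check the current state first; safe mode normally lifts on its own once enough blocks have been reported. A minimal sketch, to be run as the HDFS superuser:

```shell
# Check the NameNode's safe-mode state
# (prints "Safe mode is ON" or "Safe mode is OFF").
hdfs dfsadmin -safemode get

# Only force it off if it stays stuck ON
# (e.g. due to missing or under-reported blocks):
hdfs dfsadmin -safemode leave
```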

Explorer

Hi @ChethanYM, when I created the hdfs user I didn't set a password, but when I execute this I'm asked for one. I used the root password, but that doesn't work either.

[hdfs@MASTER ~]$ hdfs dfsadmin -safemode leave
21/06/20 17:30:15 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
safemode: Access denied for user hdfs. Superuser privilege is required
[hdfs@MASTER ~]$ sudo hdfs dfsadmin -safemode leave
[sudo] password for hdfs:
sorry, try again.
[sudo] password for hdfs:
sorry, try again.
[sudo] password for hdfs:

Super Collaborator

Hi,

Can you try the command below and see?

## sudo -u hdfs hdfs dfsadmin -safemode leave

 

If this doesn't work, provide the output of the command below from the terminal:

## id hdfs

Explorer

Hi @ChethanYM, I tried the following command and got another error:

[root@MASTER ~]# su - hdfs
Last login: Sun Jun 20 17:27:39 UTC 2021 on pts/5
[hdfs@MASTER ~]$ sudo -u hdfs hdfs dfsadmin -safemode leave
sudo: hdfs: command not found
[hdfs@MASTER ~]$ id hdfs
uid=30008(hdfs) gid=30009(hdfs) groups=30009(hdfs)
[hdfs@MASTER ~]$

New Contributor

You don't need to switch to the hdfs user to run the command below.

 

Try running the command while you are root (or a sudo user):

 

sudo -u hdfs hdfs dfsadmin -safemode leave

Explorer

Hi @RohitPathak, I tried as the root/sudo user but still get an error:
[root@MASTER ~]# sudo -u hdfs hdfs dfsadmin -safemode leave
sudo: hdfs: command not found
[root@MASTER ~]#
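One possible explanation for the "command not found" here (an assumption, not confirmed in the thread): sudo replaces PATH with the secure_path value from /etc/sudoers, so an hdfs binary installed outside /usr/bin or /usr/sbin is not found under sudo even though it works in a normal shell. A workaround sketch:

```shell
# sudo resets PATH to secure_path (see /etc/sudoers), so pass the
# caller's PATH through explicitly via env(1):
sudo -u hdfs env "PATH=$PATH" hdfs dfsadmin -safemode leave
```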

Super Collaborator

It seems hdfs is not installed or configured properly. Can you check its path using the "which hdfs" command?

If you are not able to see a path, check the environment variable in the "~/.bash_profile" file. Set the path to something like the below and try again:

## PATH=$PATH:$HADOOP_HOME/bin

 

Then run: source ~/.bash_profile
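The two steps above can be sketched together; /opt/hadoop-2.7.1 is an assumed install location, so adjust HADOOP_HOME to wherever your tarball was unpacked:

```shell
# Lines to add to ~/.bash_profile (install prefix is an assumption --
# adjust HADOOP_HOME to your actual layout):
export HADOOP_HOME=/opt/hadoop-2.7.1
export PATH="$PATH:$HADOOP_HOME/bin"
```

After sourcing ~/.bash_profile, "which hdfs" should then print the binary's location.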

 

Below is the output from my test cluster:

[root@node2 bin]# which hdfs
/usr/bin/hdfs
[root@node2 bin]# sudo -u hdfs hdfs dfsadmin -safemode leave
Safe mode is OFF
[root@node2 bin]# id hdfs
uid=993(hdfs) gid=990(hdfs) groups=990(hdfs)