
HDFS Cannot Create the directory of the user

Explorer

I installed hadoop-2.7.1 on a CentOS 7 operating system.
When I execute the command hdfs dfs -mkdir /user/Juste, I get an error:

 

[hdfs@MASTER ~]$ hdfs dfs -mkdir /user/Juste
21/06/20 11:04:04 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
mkdir: `/user/Juste': No such file or directory
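
For readers hitting the same message: in Hadoop 2.x, hdfs dfs -mkdir without -p fails with exactly this error when the parent directory (here /user) does not yet exist in HDFS. A minimal sketch of the usual fix, assuming the NameNode is running and the commands are executed as the HDFS superuser (the user that started the NameNode); the Juste:Juste owner/group is an assumption:

hdfs dfs -mkdir -p /user/Juste            # -p also creates the missing /user parent
hdfs dfs -chown Juste:Juste /user/Juste   # assumed owner/group for the new home directory
hdfs dfs -ls /user                        # verify the directory now exists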

14 Replies

Explorer

Hi @ChethanYM, I set the PATH as shown below and tried again.
When I run the command, I still get the same error.

## PATH=$PATH:$HADOOP_HOME/bin

## source ~/.bash_profile
[root@MASTER ~]# which hdfs
/usr/local/hadoop-2.7.1/bin/hdfs
[root@MASTER ~]# sudo -u hdfs hdfs dfsadmin -safemode leave
sudo: hdfs: command not found
[root@MASTER ~]# id hdfs
uid=30008(hdfs) gid=30009(hdfs) groups=30009(hdfs)
[root@MASTER ~]#
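
A note on the "sudo: hdfs: command not found" line above: sudo resets PATH to its built-in secure_path, so a binary that which hdfs can find as root is still invisible to sudo. Two possible workarounds, a sketch only, assuming the /usr/local/hadoop-2.7.1 path reported by which hdfs and that the hdfs user's own profile also puts $HADOOP_HOME/bin on its PATH:

sudo -u hdfs /usr/local/hadoop-2.7.1/bin/hdfs dfsadmin -safemode leave   # full path bypasses secure_path
su - hdfs -c "hdfs dfsadmin -safemode leave"                             # login shell sources the hdfs user's profile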

Super Collaborator

Hi,

Go to /etc/alternatives and provide the output of the commands below. (This is to check whether the Linux alternatives subsystem is pointing to binaries from an older CDH version that you are no longer using.)

 

Example:

[root@node2 alternatives]# ls -lrth | grep hdfs

lrwxrwxrwx  1 root root 62 Aug 15  2020 hdfs -> /opt/cloudera/parcels/CDH-6.2.1-1.cdh6.2.1.p0.4951328/bin/hdfs

 

[root@node2 alternatives]# ls -lrth /usr/bin/hdfs

lrwxrwxrwx 1 root root 22 Aug 15  2020 /usr/bin/hdfs -> /etc/alternatives/hdfs
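
If the grep above returns nothing, alternatives --display can confirm the same thing directly; a small sketch (on a plain Apache Hadoop tarball install there is normally no hdfs alternative at all, so an error from this command is expected):

alternatives --display hdfs    # prints the configured links, or an error if no hdfs alternative exists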

 

  1. What CDH version are you using currently? Have you recently upgraded the cluster?
  2. In my previous comment I added the “#” only to mark the commands; if you copied it into ~/.bash_profile as well, remove the “#” and try again (see the sketch after this list).
  3. How did you install HDFS? Is it possible for you to reinstall it?
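
For point 2, a sketch of what the ~/.bash_profile additions usually look like for a tarball install under /usr/local/hadoop-2.7.1 (the path is taken from the which hdfs output above; adjust it if yours differs):

# lines to add to ~/.bash_profile
export HADOOP_HOME=/usr/local/hadoop-2.7.1
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
# then reload the profile in the current shell
source ~/.bash_profile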

Explorer

Hi @ChethanYM, I installed hadoop-2.7.1, and it was during the installation that I created the hdfs user. The hdfs command only started working once I finished installing Hadoop. Here is the output of the command ls -lrth | grep hdfs; it does not return anything. Yes, I can reinstall Hadoop.


[root@MASTER alternatives]# ls -lrth | grep hdfs
[root@MASTER alternatives]#
[root@MASTER alternatives]# hadoop version
Hadoop 2.7.1
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r 20fe5304904fc2f5a18053c389e43cd26f7a70fe
Compiled by vinodkv on 2017-06-02T06:14Z
Compiled with protoc 2.5.0
From source with checksum 60125541c2b3e266cbf3becc5bda666
This command was run using /usr/local/hadoop-2.7.1/share/hadoop/common/hadoop-common-2.7.1.jar
[root@MASTER alternatives]#

Super Collaborator

Yes, it seems it is not properly installed.

May I know whether you are using plain Hadoop, or CDH or HDP to manage it?

If you followed any document for the Hadoop installation, please provide the link here.

Explorer

Hi @ChethanYM, I just use plain Hadoop. I followed this tutorial, but I only did the Hadoop installation steps: https://www.youtube.com/watch?v=71EQblrUPRM&t=1375s
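
For anyone retracing the same tutorial, it may help to confirm that the HDFS daemons are running and that the client is pointed at the NameNode rather than the local filesystem before retrying the mkdir. A quick sketch of the checks; the hdfs://MASTER:9000 value is only an example of what to expect:

jps                                   # should list NameNode, DataNode and SecondaryNameNode
hdfs getconf -confKey fs.defaultFS    # should print an hdfs:// URI (e.g. hdfs://MASTER:9000), not file:///
hdfs dfsadmin -report                 # summarizes live DataNodes and capacity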