Support Questions
Find answers, ask questions, and share your expertise
Announcements

Not able to execute hadoop fs -du -h / as root user


Hi All, 

 

I am trying to execute the command hadoop fs -du -h / as root, but it fails; I am able to execute the same command as the hdfs user.

 

What configuration needs to be changed in order to execute the above command as root as well?

 

Thanks - Kishore 

7 Replies

Re: Not able to execute hadoop fs -du -h / as root user

Explorer

If the error message is "du: Permission denied:", then your Unix user account on the edge node needs to be assigned the respective group. The group is mentioned in the error message.
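To check this, one option (the username and path below are placeholders, not values from this thread) is to ask HDFS which groups it resolves for the user and to list the permissions on the denied path:

```shell
# Ask HDFS which groups it resolves for a given user
# (substitute your own username)
hdfs groups <username>

# List the owner, group, and any ACL entries on the denied path
# (substitute the path from the error message)
hdfs dfs -getfacl <path>
```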

Re: Not able to execute hadoop fs -du -h / as root user

Hi Naane, 

 

This is the output of my command:

du: Permission denied: user=ganta.k, access=READ_EXECUTE, inode="/data/one":hive:hive:drwxrwx---:user:hive:rwx,group::r-x,group:BIGG-D-ONEXXX-AD:rwx,default:user::rwx,default:user:hive:rwx,default:group::r-x,default:group:BIGG-D-ONEXXX-AD:rwx,default:mask::rwx,default:other::---

du: Permission denied: user=ganta.k, access=READ_EXECUTE, inode="/projects/regtst":hive:hive:drwxrwx---:user:hive:rwx,group::r-x,group:BIGG-P-REGTST-AD:rwx,default:user::rwx,default:user:hive:rwx,default:group::r-x,default:group:BIGG-P-REGTST-AD:rwx,default:mask::rwx,default:other::---

du: Permission denied: user=ganta.k, access=READ_EXECUTE, inode="/tmp/logs/wei.fa.2":wei.fa.2:hadoop:drwxrwx---

du: Permission denied: user=ganta.k, access=READ_EXECUTE, inode="/user/wei.fa.2/.staging":wei.fa.2:wei.fa.2:drwx------

5.3 K  7.5 G  /hbase

0      0      /system

 

Please suggest what needs to be modified. FYI, this is from the cluster nodes, not the edge nodes.

 

Thanks

Kishore

Re: Not able to execute hadoop fs -du -h / as root user

Explorer

Kishore,

 

Per Harsh's input, hdfs is the superuser on HDFS. Granting similar access to your user should suffice, as a permission-related exception is expected for any other user (including root) on HDFS.

 

Best regards,

Anand

Re: Not able to execute hadoop fs -du -h / as root user

Community Manager
To break down the errors you are seeing:

du: Permission denied: user=ganta.k, access=READ_EXECUTE,
inode="/data/one":hive:hive:drwxrwx---:user:hive:rwx,group::r-x,group:BIGG-D-ONEXXX-AD:rwx,default:user::rwx,default:user:hive:rwx,default:group::r-x,default:group:BIGG-D-ONEXXX-AD:rwx,default:mask::rwx,default:other::---


user=ganta.k - You are running the command as the user "ganta.k"
access=READ_EXECUTE - The du command needs 'read' access to see the
file sizes and 'execute' access to traverse the directory tree
inode="/data/one" - The directory or file on which you are receiving
"Permission denied"
hive:hive:drwxrwx--- - The UNIX user, group, and permissions of the file
or directory you are attempting to access. When ACLs are defined,
additional entries are evaluated as well.

ACLs are enabled so the rest of the ACLs are listed:

user:hive:rwx
group::r-x
group:BIGG-D-ONEXXX-AD:rwx
default:user::rwx
default:user:hive:rwx
default:group::r-x
default:group:BIGG-D-ONEXXX-AD:rwx
default:mask::rwx
default:other::---

"default" ACLs are ACLs that will be inherited upon creation of a file or
directory within this directory, so they do not apply to the current access check.
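If granting this user access is actually desired, one option (a sketch, assuming ACLs are enabled via dfs.namenode.acls.enabled and that you can run commands as the hdfs superuser; the user and path are taken from the error above) is to add a named-user ACL entry:

```shell
# Add a named-user ACL entry granting read/execute, recursively,
# running as the hdfs superuser
sudo -u hdfs hdfs dfs -setfacl -R -m user:ganta.k:r-x /data/one

# Verify the new entry is present
hdfs dfs -getfacl /data/one
```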




David Wilder, Community Manager



Re: Not able to execute hadoop fs -du -h / as root user

Master Guru
'hdfs' is the superuser of the HDFS service, not 'root'. So a permission-related exception is expected when running commands as 'root' on HDFS.

In other words, on HDFS the 'root'-equivalent user is 'hdfs' (because the HDFS service runs under this username).
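Concretely, if you have root on the node, you can impersonate the hdfs superuser instead of changing any configuration:

```shell
# Run the command as the hdfs superuser rather than as root
sudo -u hdfs hadoop fs -du -h /
```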

Re: Not able to execute hadoop fs -du -h / as root user

Hi Harsha, 

 

Greetings for the day.

 

Can you give some idea on this issue:

 

Data in the .Trash directory is not getting auto-deleted even though fs.trash.interval is set to 1 day.

 

Data is accumulating and the disk is filling up.

 

Is there anything else to set in order to remove data from the trash folder?

 

Thanks

Kishore 
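For reference on the trash question: fs.trash.interval is specified in minutes (1440 for one day), and the emptier only removes completed checkpoints, so data can persist for up to roughly twice the interval. A sketch for verifying the effective settings and forcing a cleanup (values are cluster-specific):

```shell
# Check the effective trash settings on this cluster
hdfs getconf -confKey fs.trash.interval
hdfs getconf -confKey fs.trash.checkpoint.interval

# Force creation of a trash checkpoint and deletion of expired
# checkpoints now, for the current user's .Trash
hadoop fs -expunge
```

Note that trash is kept per user under each user's .Trash directory, so an expunge only affects the user running it.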

Re: Not able to execute hadoop fs -du -h / as root user

Hi All,

 

Thanks for the reply. Let me summarize: there are 2 clusters in my environment.

 

[root@xxxxxx ~]# groups hdfs
hdfs : hadoop oinstall hdfs hdfs_superuser hdfs_admin

[root@xxxxxx ~]# groups root
root : root bin daemon sys adm disk wheel

 

For the above cluster, I am able to execute hadoop fs -du -h / both as the root and hdfs users. But for the cluster below, I am not able to execute it as root.

 

[root@xxxxxx ~]# groups hdfs
hdfs : hadoop oinstall hdfs
[root@xxxxx ~]# groups root
root : root

Can we make any modifications to get this working?

 

Thanks

Kishore
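One possible explanation (a hedged sketch, not confirmed from this thread): HDFS treats members of the group named by dfs.permissions.superusergroup (default 'supergroup') as superusers. If the first cluster's configuration points that property at a group such as hdfs_superuser, adding root to the same group on the second cluster should make these commands work:

```shell
# Find which group HDFS treats as the superuser group
hdfs getconf -confKey dfs.permissions.superusergroup

# Add root to that group; 'hdfs_superuser' here assumes the property
# returned that name - substitute the actual value
usermod -aG hdfs_superuser root
```

Group resolution is typically performed on the NameNode host, so the group membership change must be visible there for it to take effect.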