Permission denied: user=root, access=WRITE, inode="/user":hdfs:supergroup:dr

New Contributor

When I use HDFS commands, I get errors. For example:

 

hadoop fs -mkdir /user/myfile   or   hadoop fs -put myfile.txt /user/

Hadoop displays "Permission denied: user=root, access=WRITE, inode="/user":hdfs:supergroup:dr..." and so on. Why does this happen? Can anyone help?

1 ACCEPTED SOLUTION

Expert Contributor
Hey,

The /user/ directory is owned by "hdfs" with 755 permissions, so only the hdfs user can write to it. Unlike on Unix/Linux, the HDFS superuser is "hdfs", not root. So you would need to do this:

sudo -u hdfs hadoop fs -mkdir /user/myfile
sudo -u hdfs hadoop fs -put myfile.txt /user/

If you want to create a home directory for root so you can store files in that directory, do:

sudo -u hdfs hadoop fs -mkdir /user/root
sudo -u hdfs hadoop fs -chown root /user/root

Then as root you can do "hadoop fs -put file /user/root/".
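To double-check that the chown took effect (just a sanity check; the size and timestamp columns will differ on your cluster), list the directory:

hadoop fs -ls -d /user/root

It should show something like "drwxr-xr-x - root supergroup ... /user/root".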

Hope this helps.

Chris


New Contributor

I had this issue too in Dev during our automated oozie sharelib creation process--which was essentially doing this:

oozie-setup sharelib create -fs hdfs://devhost:8020 -locallib /usr/lib/oozie/oozie-sharelib-yarn.tar.gz

 

The error I was seeing was:

org.apache.hadoop.security.AccessControlException: Permission denied: user=oozie, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x

 

It wasn't exactly clear at first, because the error above made it look like a permissions problem.

 

For me it was solved by changing the address the service was listening on in /etc/hadoop/conf/core-site.xml. Mine was previously set to "localhost:8020" (127.0.0.1).

 

So to be clear my fix was this:

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://0.0.0.0:8020</value>
  </property>
</configuration>

 

Then bounce the service with "service hadoop-hdfs-namenode restart".
Optional: validate with netstat -tupln | grep '8020'
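You can also confirm which address the clients will use (optional, and assuming the updated client configuration has been deployed to the host you run this from):

hdfs getconf -confKey fs.defaultFS

It should print hdfs://0.0.0.0:8020, or whatever host:port you configured above.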

 

Hope this helps someone else out.

-cmcc

 

Community Manager

A quick update to this thread to advise of a new Community Knowledge Article on this subject.

How to resolve "Permission denied" errors in CDH


Cy Jervis, Manager, Community Program

New Contributor

Thank you so much!

New Contributor
This is the solution.
export HADOOP_USER_NAME=hdfs
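One note for anyone else trying this: HADOOP_USER_NAME is only honored on clusters without Kerberos authentication, so a minimal session (a sketch, assuming an unsecured cluster) would look like:

export HADOOP_USER_NAME=hdfs
hadoop fs -mkdir /user/root
hadoop fs -chown root /user/root
unset HADOOP_USER_NAME

Unsetting the variable afterwards avoids accidentally running later commands as the superuser.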

New Contributor

Thanks, it worked nicely.

New Contributor


In Cloudera Manager, go to the HDFS configuration under Advanced and put the following snippet in the HDFS Service Configuration Safety Valve:

<property>
<name>dfs.permissions</name>
<value>false</value>
</property>

New Contributor
To be specific, use "HDFS Service Advanced Configuration Snippet (Safety Valve) for hdfs-site.xml".

New Contributor

Still can't copy files to the folder under /user.

Champion

@Pandeyg2106, could you let us know the error you are getting and the user you are running the command as?

New Contributor
export HADOOP_USER_NAME=hdfs