
Permission denied: user=root, access=WRITE, inode="/user":hdfs:supergroup:dr

New Contributor

When I use HDFS commands, I get errors. For example:

hadoop fs -mkdir /user/myfile
hadoop fs -put myfile.txt /user/

Hadoop displays "Permission denied: user=root, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x" and so on. Why? Can anyone help me?

1 ACCEPTED SOLUTION

Super Collaborator
Hey,

The /user/ directory is owned by "hdfs" with 755 permissions, so only the hdfs user can write to it. Unlike on Unix/Linux, the HDFS superuser is "hdfs", not root. So you would need to do this:

sudo -u hdfs hadoop fs -mkdir /user/myfile
sudo -u hdfs hadoop fs -put myfile.txt /user/

If you want to create a home directory for root so you can store files there, do:

sudo -u hdfs hadoop fs -mkdir /user/root
sudo -u hdfs hadoop fs -chown root /user/root

Then as root you can do "hadoop fs -put file /user/root/".
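You can double-check afterwards by listing the directory; the /user/root entry should now show root as the owner (dates and sizes will differ on your cluster):

hadoop fs -ls /user
# drwxr-xr-x   - root supergroup          0 2016-01-01 00:00 /user/root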

Hope this helps.

Chris


12 REPLIES


New Contributor

I had this issue too in Dev during our automated Oozie sharelib creation process, which was essentially doing this:

oozie-setup sharelib create -fs hdfs://devhost:8020 -locallib /usr/lib/oozie/oozie-sharelib-yarn.tar.gz

 

The error I was seeing was:

org.apache.hadoop.security.AccessControlException: Permission denied: user=oozie, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x

 

The root cause wasn't obvious at first, since the message above makes it look like a pure permissions problem.

 

For me it was solved by changing the address the service listens on in /etc/hadoop/conf/core-site.xml. Mine was previously set to "localhost:8020" (127.0.0.1).

 

So to be clear my fix was this:

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://0.0.0.0:8020</value>
  </property>
</configuration>

 

Then bounce the service with "service hadoop-hdfs-namenode restart".
Optional: validate with "netstat -tupln | grep 8020".
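As one more sanity check, you can print the fs.defaultFS value the client actually resolves (the hdfs getconf utility ships with stock Hadoop/CDH):

hdfs getconf -confKey fs.defaultFS
# should print hdfs://0.0.0.0:8020 after the change above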

 

Hope this helps someone else out.

-cmcc

 

Community Manager

A quick update to this thread to advise of a new Community Knowledge Article on this subject.

How to resolve "Permission denied" errors in CDH


Cy Jervis, Manager, Community Program

New Contributor

Thank you so much!

New Contributor
This is the solution.
export HADOOP_USER_NAME=hdfs
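For example, a minimal sketch of this approach (note that HADOOP_USER_NAME is only honored with simple authentication; it is ignored on Kerberos-secured clusters):

export HADOOP_USER_NAME=hdfs
hadoop fs -mkdir /user/root
hadoop fs -chown root /user/root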

New Contributor

Thanks, it worked nicely.

New Contributor

Thanks, it solved the problem.

New Contributor


In Cloudera Manager, go to the HDFS configuration, under Advanced, and put the following in the HDFS Service Configuration Safety Valve:

<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>

New Contributor
To be specific: "HDFS Service Advanced Configuration Snippet (Safety Valve) for hdfs-site.xml".
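For reference, on a cluster without Cloudera Manager the equivalent (a sketch, assuming the default config path) is to add the same property to /etc/hadoop/conf/hdfs-site.xml and restart the NameNode. Keep in mind this turns off HDFS permission checking cluster-wide, so it is really only suitable for dev clusters:

<!-- disables all HDFS permission checks; dev/test only -->
<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>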