Permission denied: user=root, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x
Labels: Apache Hadoop, HDFS
Created on 01-15-2014 09:58 PM - edited 09-16-2022 01:52 AM
When I use HDFS commands, they display errors. For example:
hadoop fs -mkdir /user/myfile
hadoop fs -put myfile.txt /user/
Hadoop displays "Permission denied: user=root, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x" and so on. Why does this happen, and who can help me?
Created 01-16-2014 07:04 AM
The /user/ directory is owned by "hdfs" with 755 permissions. As a result, only the hdfs user can write to that directory. Unlike Unix/Linux, in HDFS the superuser is hdfs, not root. So you would need to do this:
sudo -u hdfs hadoop fs -mkdir /user/myfile
sudo -u hdfs hadoop fs -put myfile.txt /user/
If you want to create a home directory for root so you can store files in it, do:
sudo -u hdfs hadoop fs -mkdir /user/root
sudo -u hdfs hadoop fs -chown root /user/root
Then as root you can do "hadoop fs -put file /user/root/".
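To verify the ownership change, you can list the parent directory afterwards (a quick sanity check; the exact output format varies a bit by version):
hadoop fs -ls /user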
Hope this helps.
Chris
Created 10-15-2014 03:45 PM
I had this issue too in Dev, during our automated Oozie sharelib creation process, which was essentially doing this:
oozie-setup sharelib create -fs hdfs://devhost:8020 -locallib /usr/lib/oozie/oozie-sharelib-yarn.tar.gz
The error I was seeing was:
org.apache.hadoop.security.AccessControlException: Permission denied: user=oozie, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x
At first it looked like a straightforward permissions error, given the message above, but it wasn't. For me it was solved by changing the address the service was listening on in /etc/hadoop/conf/core-site.xml. Mine was previously listening on "localhost:8020" (127.0.0.1).
So, to be clear, my fix was this:
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://0.0.0.0:8020</value>
  </property>
</configuration>
Then bounce the service with: sudo service hadoop-hdfs-namenode restart
Optional: validate with netstat -tupln | grep '8020'
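If the change took effect, the NameNode should show up bound to all interfaces rather than loopback, roughly like this (the PID here is just a placeholder):
tcp 0 0 0.0.0.0:8020 0.0.0.0:* LISTEN 1234/java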
Hope this helps someone else out.
-cmcc
Created 01-14-2016 01:20 PM
A quick update to this thread to advise of a new Community Knowledge Article on this subject.
How to resolve "Permission denied" errors in CDH
Cy Jervis, Manager, Community Program
Created 05-05-2017 11:01 PM
Thank you so much!
Created 11-16-2017 01:21 AM
export HADOOP_USER_NAME=hdfs
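For context: on clusters without Kerberos, HDFS trusts the user name the client supplies, so setting this variable makes subsequent commands run as the hdfs superuser. A minimal example (the directory name is just an illustration):
export HADOOP_USER_NAME=hdfs
hadoop fs -mkdir /user/myfile
Note that this only works on unsecured clusters and effectively sidesteps the permission checks, so use it with care.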
Created 12-04-2019 04:38 AM
Thanks, it worked nicely.
Created 06-27-2023 01:54 PM
Thanks, it solved the problem.
Created 06-21-2016 10:09 AM
In Cloudera Manager, go to the HDFS configuration under Advanced and put the following in the HDFS Service Configuration Safety Valve:
<property>
<name>dfs.permissions</name>
<value>false</value>
</property>
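One caveat worth adding: this disables HDFS permission checking cluster-wide, which is usually only advisable in development environments. Note also that on Hadoop 2 and later the property is named dfs.permissions.enabled, and the NameNode must be restarted for the change to take effect.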