01-15-2014 09:58 PM
When I use HDFS commands, they display errors:
hadoop fs -mkdir /user/myfile or "hadoop fs -put myfile.txt /user/"
Hadoop displays "Permission denied: user=root, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x" and so on. Why? Can anyone help me?
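The accepted solution is not quoted in this thread, but the usual resolution for this error is that the user (here, root) has no home directory under /user, which is owned by hdfs:supergroup and not world-writable. A sketch of the standard fix, assuming the failing user is root and the hdfs superuser account exists:

```shell
# Create the user's HDFS home directory as the hdfs superuser
sudo -u hdfs hadoop fs -mkdir /user/root
# Hand ownership to the user so it can write there
sudo -u hdfs hadoop fs -chown root:root /user/root
# Writes by root under its own home directory should now succeed
hadoop fs -put myfile.txt /user/root/
```

These commands require a running cluster and the hdfs superuser, so they cannot be verified outside one; adjust the username to match the `user=` shown in your error.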
01-16-2014 07:04 AM
10-15-2014 03:45 PM
I had this issue too in Dev during our automated oozie sharelib creation process, which was essentially doing this:
oozie-setup sharelib create -fs hdfs://devhost:8020 -locallib /usr/lib/oozie/oozie-sharelib-yarn.tar.gz
The error I was seeing was:
org.apache.hadoop.security.AccessControlException: Permission denied: user=oozie, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x
At first this looked like a straightforward permissions error, given the message above, but it wasn't. For me it was solved by changing the address the service was listening on in /etc/hadoop/conf/core-site.xml. Mine was previously listening on "localhost:8020" (127.0.0.1).
So to be clear, my fix was to change that address in core-site.xml to the host's real hostname, and then bounce the service with service hadoop-hdfs-namenode restart.
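The original post does not preserve the exact property changed; a sketch of the change, assuming the address in question is fs.defaultFS (the usual property bound to localhost in this situation):

```xml
<!-- /etc/hadoop/conf/core-site.xml -->
<!-- Assumption: the bound address is fs.defaultFS; replace devhost
     with your NameNode's actual hostname, not localhost/127.0.0.1 -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://devhost:8020</value>
</property>
```

With the NameNode listening on the real interface, clients addressing hdfs://devhost:8020 (as in the oozie-setup command above) can reach it.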
optional: validate with netstat -tupln | grep '8020'
Hope this helps someone else out.
01-14-2016 01:20 PM
A quick update to this thread to advise of a new Community Knowledge Article on this subject.
Cy Jervis, Community Moderator - I'm not an expert but will supply relevant content from time to time. :)
06-21-2016 10:09 AM
In Cloudera Manager, go to the HDFS configuration under Advanced and put the following snippet in the HDFS Service Configuration Safety Valve:
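The snippet itself is not preserved above. A commonly suggested safety-valve entry for this error disables HDFS permission checking entirely (property dfs.permissions in older CDH releases, dfs.permissions.enabled in newer Hadoop); whether this matches the original post is an assumption, and note it removes permission enforcement cluster-wide, which is rarely appropriate outside a sandbox:

```xml
<!-- Assumption: the missing snippet disables HDFS permission checks -->
<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>
```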