Created 07-21-2016 08:29 PM
I have a local user on one of the gateway nodes and I am trying to query one of the Hive tables with that local user account. It looks like I do not have enough permissions; the following is the error message:
Job Submission failed with exception 'org.apache.hadoop.security.AccessControlException(org.apache.hadoop.security.AccessControlException: Permission denied: user=XXXXXX, access=WRITE, inode="/user":hdfs:hdfs:drwxr-xr-x)' FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MapRedTask
Please advise what my next step should be to get access.
Created 07-21-2016 08:30 PM
Your user needs an HDFS user directory.
As the hdfs user, create a directory for your OS user under /user and grant your user ownership of that /user/youruser folder.
As root do:
sudo su hdfs
hadoop fs -mkdir /user/youruser
hadoop fs -chown youruser /user/youruser
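For instance, a minimal sketch assuming the OS account is called analyst1 (a placeholder name), with a final listing to confirm the new ownership:
sudo su hdfs                                 # switch to the hdfs superuser
hadoop fs -mkdir /user/analyst1              # create the user's HDFS home directory
hadoop fs -chown analyst1 /user/analyst1     # hand ownership to the OS user
hadoop fs -ls /user                          # verify that analyst1 now owns /user/analyst1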
Try again with your OS user, and if this response addressed your problem, please vote and accept it as the best response.
Created 07-21-2016 08:53 PM
This worked, thanks Constantin.
Created 07-21-2016 08:32 PM
Similar to this issue.
You are getting a "Permission denied" error because you are trying to access a folder that is owned by the hdfs user, and its permissions do not allow write access for others.
A) You could run your application/script as the hdfs user:
su hdfs
or
export HADOOP_USER_NAME=hdfs
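For example, a minimal sketch of option A using the environment variable; the database, table, and query below are placeholders, not from the original post:
export HADOOP_USER_NAME=hdfs                    # Hadoop commands in this shell now run as hdfs
hive -e "SELECT COUNT(*) FROM mydb.mytable;"    # placeholder query; re-run your own failing query here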
B) Change the owner of the directory to your user (note: to change the owner you have to be a superuser or the current owner, i.e. hdfs):
hdfs dfs -chown -R <username_of_new_owner> /user
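For instance, a hedged sketch of option B that scopes the change to the user's own home directory instead of all of /user (analyst1 is a placeholder username):
export HADOOP_USER_NAME=hdfs                  # act with superuser rights for the chown
hdfs dfs -chown -R analyst1 /user/analyst1    # give analyst1 ownership of just their own directory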