HDFS ACL giving permission denied error for Spark job
Labels: Apache Hadoop
Created 10-15-2018 10:06 PM
I have a user user1 in group group1.
I use a third-party tool that writes Spark event logs to a directory in HDFS. For now I am using the user's home directory to hold the logs temporarily.
I created this directory like:
hdfs dfs -mkdir /user/user1/sparkeventlogs
Right now it is created with ownership user1:group2, so I changed ownership to the right group:
hdfs dfs -chown -R user1:group1 /user/user1/sparkeventlogs
I also added an ACL to the above directory using the setfacl command, and getfacl shows the correct user and group assigned; both have rwx permissions.
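For reference, the ACL was applied along these lines (the exact acl_spec here is illustrative and may differ from what I ran):
hdfs dfs -setfacl -R -m user:user1:rwx,group:group1:rwx /user/user1/sparkeventlogs
hdfs dfs -getfacl /user/user1/sparkeventlogs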
Now when the job is run by a user in that AD group, it fails with permission denied, reporting the directory as:
user1:group2 drwx------
when it is actually:
user1:group1 drwxrwx---
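In case it is useful, my understanding is that a directory with an ACL actually applied shows a trailing + on its permission string, so a quick check like the one below should print something like drwxrwx---+ if the ACL took effect:
hdfs dfs -ls -d /user/user1/sparkeventlogs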
We have Ranger enabled, but I don't have access to it.
Thanks for your help.
Created 10-17-2018 12:11 PM
@PJ
Since you have Ranger enabled, it's possible that the permission is being denied on the Ranger side. I would definitely check the Ranger audit logs for events from these users and see whether the permission denied shows up there.
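If you cannot reach the Ranger Admin UI (the Audit > Access tab), Ranger audits are often also spooled to HDFS. This is only a sketch; the path and date-directory format below are common defaults and may differ in your cluster, and denied requests typically show "result":0 in the audit JSON:
hdfs dfs -ls /ranger/audit/hdfs
hdfs dfs -cat /ranger/audit/hdfs/20181017/* | grep -i sparkeventlogs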
Also, once you validate that it was Ranger blocking the access, I would add a Ranger HDFS policy that allows user user1 write access to /user/user1/sparkeventlogs.
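If it helps, such a policy can also be created through the Ranger Admin public REST API instead of the UI. A minimal sketch, assuming Ranger Admin at ranger-host:6080, admin credentials, and an HDFS service named hdfs_service (all placeholders for your environment):
curl -u admin:password -X POST -H "Content-Type: application/json" \
  http://ranger-host:6080/service/public/v2/api/policy \
  -d '{
    "service": "hdfs_service",
    "name": "user1_sparkeventlogs",
    "resources": { "path": { "values": ["/user/user1/sparkeventlogs"], "isRecursive": true } },
    "policyItems": [ {
      "users": ["user1"],
      "accesses": [
        { "type": "read",    "isAllowed": true },
        { "type": "write",   "isAllowed": true },
        { "type": "execute", "isAllowed": true }
      ]
    } ]
  }'
Whoever administers Ranger can do the same thing from Service Manager > your HDFS service > Add New Policy in the Admin UI.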
