
HDFS ACL giving permission denied error for Spark job

Expert Contributor

I have a user, user1, in group group1.

I use a third-party tool that writes Spark event logs to a directory in HDFS. For now I am writing the logs to the user's home directory temporarily.

I created the directory like this:

hdfs dfs -mkdir /user/user1/sparkeventlogs

Right now this directory is owned by user1:group2.
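For reference, this is how I checked the ownership (using -d so the listing shows the directory itself rather than its contents):

```shell
# Show owner, group, and permission bits of the directory itself
hdfs dfs -ls -d /user/user1/sparkeventlogs
```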

I changed the ownership to the correct group:

hdfs dfs -chown -R user1:group1 /user/user1/sparkeventlogs

I also added an ACL to the directory using the setfacl command, and getfacl shows the correct user and group assigned; both have rwx permissions.
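The ACL commands I ran looked roughly like this (a sketch from memory; the exact entries may have differed, and the default ACL line is only needed if new files should inherit the permissions):

```shell
# Grant group1 rwx via an access ACL on the directory
hdfs dfs -setfacl -m group:group1:rwx /user/user1/sparkeventlogs

# Optionally add a matching default ACL so newly created files inherit it
hdfs dfs -setfacl -m default:group:group1:rwx /user/user1/sparkeventlogs

# Verify the resulting ACL entries
hdfs dfs -getfacl /user/user1/sparkeventlogs
```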

Now, when the job is run by a user in that AD group, it fails with permission denied, reporting the directory as:

user1:group2 drwx------

when in fact it is:

user1:group1 drwxrwx---

We have Ranger enabled, but I don't have access to it.

Thanks for your help.

1 REPLY

Super Collaborator

@PJ

Since you have Ranger enabled, it's possible the permission is being denied on the Ranger side. I would definitely check the Ranger audit logs for events for that user and see whether the permission-denied entries show up there.

Once you validate that it is Ranger blocking access, I would also add a Ranger HDFS policy allowing user1 write access to /user/user1/sparkeventlogs.
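If you can get admin access (or ask an admin), such a policy can also be created through Ranger's public REST API rather than the UI. This is only a sketch: the host, credentials, and the "cm_hdfs" service name below are placeholders you would replace with your cluster's values:

```shell
# Sketch: create an HDFS allow policy via the Ranger admin REST API.
# ranger-host, admin:password, and "cm_hdfs" are placeholders.
curl -u admin:password -H 'Content-Type: application/json' \
  -X POST 'http://ranger-host:6080/service/public/v2/api/policy' \
  -d '{
    "service": "cm_hdfs",
    "name": "sparkeventlogs-write",
    "resources": {
      "path": { "values": ["/user/user1/sparkeventlogs"], "isRecursive": true }
    },
    "policyItems": [{
      "users": ["user1"],
      "accesses": [
        { "type": "read",    "isAllowed": true },
        { "type": "write",   "isAllowed": true },
        { "type": "execute", "isAllowed": true }
      ]
    }]
  }'
```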