Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.
Announcements
This board is archived and read-only for historical reference. To ask a new question, please post a new topic on the appropriate active board.

HDFS ACLs

Explorer

Hello, I'm trying to use ACLs in my Cloudera cluster. I set the following property to true in /etc/hadoop/conf/hdfs-site.xml:

<property>
  <name>dfs.namenode.acls.enabled</name>
  <value>true</value>
</property>

but when I execute an ACL command, I receive the following message:

setfacl: The ACL operation has been rejected. Support for ACLs has been disabled by setting dfs.namenode.acls.enabled to false.

Thanks,

1 ACCEPTED SOLUTION

Master Collaborator

In your cloudera-quickstart-vm-5.5.0-0-virtualbox image on VirtualBox, open a web browser, navigate to Cloudera Manager, and log in. On the Cloudera Manager Server home page, press the search hotkey "/" (a forward slash, without the quotes), then type acl and select 'hdfs: Enable Access Control Lists' from the drop-down list. This takes you to the relevant configuration page, where you tick the checkbox, click [Save], and restart the HDFS service.
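Once HDFS has restarted, you can double-check that the change took effect from a shell on the VM (a quick sketch, assuming the standard `hdfs` CLI is on the PATH):

```shell
# Print the effective value of the flag from the client configuration
hdfs getconf -confKey dfs.namenode.acls.enabled

# With ACLs enabled, a getfacl should now succeed instead of returning
# the "Support for ACLs has been disabled" rejection
sudo -u hdfs hadoop fs -getfacl /
```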

View solution in original post

5 REPLIES

Master Collaborator

Is your cluster managed by Cloudera Manager server, or are you managing the Hadoop configurations using an editor?

Explorer

Hi Michalis, I'm using cloudera-quickstart-vm-5.5.0-0-virtualbox on VirtualBox.

I'm not sure who is managing the cluster.

 

Regards,

Master Collaborator

In your cloudera-quickstart-vm-5.5.0-0-virtualbox image on VirtualBox, open a web browser, navigate to Cloudera Manager, and log in. On the Cloudera Manager Server home page, press the search hotkey "/" (a forward slash, without the quotes), then type acl and select 'hdfs: Enable Access Control Lists' from the drop-down list. This takes you to the relevant configuration page, where you tick the checkbox, click [Save], and restart the HDFS service.

Explorer

Michalis, thanks for your help. It works now.

 

But now I have another problem. I have two users created in the Linux OS, and two groups:

cgmuros -> grp_cl_developer

cgmurosbrasil -> grp_br_developer

 

I defined the following permissions:

# /chile/desarrollo/000_int_project drwxr-x---  hdfs:grp_cl_developer

 

I executed the following command to grant read and execute permission to the cgmurosbrasil user:

# sudo -u hdfs hadoop fs -setfacl -R -m user:cgmurosbrasil:r-x /chile/desarrollo/000_int_project
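As a side note (a sketch assuming the stock Hadoop CLI and the same path as above), you can confirm what -setfacl produced, including the mask entry that caps the effective permissions of named users:

```shell
# List the ACL entries on the directory; the mask:: line limits the
# effective rights of named-user and named-group entries.
sudo -u hdfs hadoop fs -getfacl /chile/desarrollo/000_int_project
```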

 

When I switch to the cgmurosbrasil user and execute the following command, I receive this message:

# hadoop fs -ls /chile/desarrollo/000_int_project

 

"Permission denied: user=cgmurosbrasil, access=EXECUTE, inode="/chile":hdfs:grp_cl_developer:drwxr-x---"
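The error points at the parent, not the project directory: to reach /chile/desarrollo/000_int_project, cgmurosbrasil needs execute (traverse) permission on every ancestor directory, starting with /chile. The same rule holds on a local Linux filesystem, which makes it easy to demonstrate (a hypothetical /tmp path; run as a regular user, since root bypasses permission checks):

```shell
# Traversing a path requires execute permission on every ancestor
# directory, not only on the target.
mkdir -p /tmp/acldemo/chile/desarrollo

chmod u-x /tmp/acldemo/chile          # remove traverse on the ancestor
ls /tmp/acldemo/chile/desarrollo 2>/dev/null || echo "denied"

chmod u+x /tmp/acldemo/chile          # restore traverse permission
ls /tmp/acldemo/chile/desarrollo >/dev/null && echo "allowed"
rm -r /tmp/acldemo
```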

 

Thanks for your time.

 

Explorer

I fixed it by assigning r-x on the /chile directory:

 

# sudo -u hdfs hadoop fs -setfacl -R -m user:cgmurosbrasil:r-x /chile
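Glad it works. One hedged suggestion: recursive r-x on all of /chile grants read access everywhere under it, which may be broader than you need. If the goal is only to let cgmurosbrasil pass through the parents down to the project, execute-only entries on the ancestors would be a narrower fix (a sketch using the same paths as above):

```shell
# Traverse-only (--x) on the ancestors: the user can pass through
# /chile and /chile/desarrollo but cannot list their contents.
sudo -u hdfs hadoop fs -setfacl -m user:cgmurosbrasil:--x /chile
sudo -u hdfs hadoop fs -setfacl -m user:cgmurosbrasil:--x /chile/desarrollo
```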

 

Thanks.