How do I configure HDFS ACLs on Amazon?


Hi guys,
I have started the topic "Configure HDFS ACLs". I am using an Amazon machine.
How do I create a new user, and what other configuration is needed?
Thanks

1 ACCEPTED SOLUTION

Master Mentor

@Mudassar Hussain

Assuming no Kerberos, but you want your user to access the HDP cluster: usually the local users live on the edge node. To apply HDFS ACLs, the local user should have a home directory in HDFS.

Create a local user on the edge node; here my user toto doesn't belong to any extra group, for demo purposes.

# useradd toto
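
To double-check the new account you can use the standard id command (the exact uid/gid numbers will differ on your machine):

# id toto
uid=1005(toto) gid=1005(toto) groups=1005(toto)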

Before you can implement HDFS ACLs you MUST add the property below in hdfs-site.xml (or custom hdfs-site in Ambari) on the NameNode; the default value is false. Then restart all components with stale configs.

dfs.namenode.acls.enabled=true
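
If you edit hdfs-site.xml by hand instead of going through Ambari, the entry would look like the usual Hadoop property block:

<property>
  <name>dfs.namenode.acls.enabled</name>
  <value>true</value>
</property>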

As the hdfs user, create a directory acldemo in toto's home in HDFS (-p also creates /user/toto if it does not exist yet):

$ hdfs dfs -mkdir -p /user/toto/acldemo

As the hdfs user, change the ownership:

$ hdfs dfs -chown toto:hdfs /user/toto/acldemo
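
If you are logged in as root or ec2-user instead of switching to the hdfs account, the same two steps can usually be run through sudo (assuming the hdfs service account exists on that node):

$ sudo -u hdfs hdfs dfs -mkdir -p /user/toto/acldemo
$ sudo -u hdfs hdfs dfs -chown toto:hdfs /user/toto/acldemo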

Create 3 dummy files locally and copy them to HDFS.
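
For example, the empty files can first be created locally with touch (a standard shell command) and then pushed with -put:

$ touch test2.txt test3.json test.txt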

$ hdfs dfs -put test2.txt test3.json test.txt /user/toto/acldemo 

Validate the copy process

$ hdfs dfs -ls /user/toto/acldemo 
-rw-r--r-- 3 hdfs hdfs 0 2017-12-12 13:38 /user/toto/acldemo/test.txt 
-rw-r--r-- 3 hdfs hdfs 0 2017-12-12 13:38 /user/toto/acldemo/test2.txt 
-rw-r--r-- 3 hdfs hdfs 0 2017-12-12 13:38 /user/toto/acldemo/test3.json

Set ACLs on the directory acldemo for different users, namely toto, hive and kafka. To see all the available subcommands, type hdfs dfs and hit ENTER.

User toto gets no permissions (---):

$ hdfs dfs -setfacl -m user:toto:--- /user/toto/acldemo 

User hive gets read, write and execute (rwx):

$ hdfs dfs  -setfacl -m user:hive:rwx  /user/toto/acldemo 

User kafka gets read and execute only (r-x):

$ hdfs dfs  -setfacl -m user:kafka:r-x  /user/toto/acldemo 

To check the current ACLs:

$ hdfs dfs  -getfacl /user/toto/acldemo
# file: /user/toto/acldemo
# owner: toto
# group: hdfs
user::rwx
user:hive:rwx
user:kafka:r-x
user:toto:---
group::r-x
mask::rwx
other::r-x
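
I did not do it above, but if you also want files created later inside acldemo to pick up entries like these automatically, HDFS supports default ACL entries on directories, for example:

$ hdfs dfs -setfacl -m default:user:hive:rwx /user/toto/acldemo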

Now check whether the permissions work.

User kafka can read but can NOT copy any files into the directory:

[kafka@host]$ hdfs dfs -put kafak.txt  /user/toto/acldemo
put: Permission denied: user=kafka, access=WRITE, inode="/user/toto/acldemo/kafak.txt._COPYING_":toto:hdfs:drwxrwxr-x
[kafka@host ~]$ hdfs dfs -cat  /user/toto/acldemo/test.txt
If you can read me then you have the correct permisions

User toto has no permissions at all:

[toto@host]$ hdfs dfs -cat /user/toto/acldemo/test.txt
cat: Permission denied: user=toto, access=EXECUTE, inode="/user/toto/acldemo/test.txt":toto:hdfs:drwxrwxr-x

For user hive the exit code is 0 ("success") because it can read the contents of the test.txt file in HDFS:

[hive@host]$ hdfs dfs -cat /user/toto/acldemo/test.txt
If you can read me then you have the correct permisions
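
To see that exit code explicitly, you can print the status of the last command right after running it:

[hive@host]$ echo $?
0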

To know whether a directory has ACLs, notice the + sign at the end of the permission bits:

$ hdfs dfs -ls /user/toto/ 
Found 1 items 
drwxrwxr-x+ - hdfs hdfs 0 2017-12-12 14:15 /user/toto/acldemo
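
And in case you need to undo this later, -setfacl can also remove entries: -x removes a named entry and -b strips all the extended ACL entries from a path, for example:

$ hdfs dfs -setfacl -x user:kafka /user/toto/acldemo
$ hdfs dfs -setfacl -b /user/toto/acldemo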

Hope that helps


8 REPLIES


@Aditya Sirna Thanks for your prompt reply.
Also, I need to add new users on that Amazon machine and then change their rights, etc.

Super Guru

@Mudassar Hussain,

Yes, you can add users on that machine. Make sure to add the user on all the nodes of the cluster.
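
For example, a rough way to do that from a single box, assuming passwordless SSH as root and the hypothetical hostnames node1, node2 and node3:

for h in node1 node2 node3; do ssh root@$h "useradd {username}"; done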


Can you please give me some kind of web link which will help me? Thanks.

Super Guru

@Mudassar Hussain,

This link gives the usage for ACLs. Configuring ACLs is as simple as I mentioned above: just add that config and restart the services.

To add the user you can run the command

useradd {username}
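
To confirm the account afterwards, or to list all the local users on a node, the standard commands work:

getent passwd {username}
cut -d: -f1 /etc/passwd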

Use this link for more info

Can you please accept the original answer if this helps you? This will be really helpful for other community users.

Thanks,

Aditya


Thanks a lot @Aditya Sirna


Thanks a lot @Geoffrey Shelton Okot for the brief answer.
I am sorry, I am totally new to this, so I did not even know where to run these commands. I am using an Amazon machine.
I created a new user "toto" on "Node2", but I do not know how to view all the users on "Node2", including "toto".
Then I went to the "NameNode" and viewed the file "hdfs-site.xml", but I did not find the property "dfs.namenode.acls.enabled".
Actually, I need to know which command to run on which "node" in the Amazon machine environment.
Thanks again