Member since: 01-19-2017
Posts: 3676
Kudos Received: 632
Solutions: 372

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 616 | 06-04-2025 11:36 PM |
|  | 1182 | 03-23-2025 05:23 AM |
|  | 585 | 03-17-2025 10:18 AM |
|  | 2192 | 03-05-2025 01:34 PM |
|  | 1378 | 03-03-2025 01:09 PM |
01-12-2018
09:57 PM
@Mudassar Hussain As a prerequisite for questions 1, 2 and 3, I am assuming you are creating the ACLs from scratch; below are the steps to prepare the groups and users.

Create the two groups:

# groupadd Marketing
# groupadd Account

Add the three users to the Marketing group:

# useradd -G Marketing Mark1
# useradd -G Marketing Mark2
# useradd -G Marketing Mark3

Add the three users to the Account group:

# useradd -G Account AC1
# useradd -G Account AC2
# useradd -G Account AC3
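If you want a quick sanity check that the group memberships took effect (just a suggestion, not part of the original steps), the id command shows each user's supplementary groups:

# id Mark1
# id AC1

Marketing should appear in the groups= output for Mark1, and Account for AC1.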
Answer to question 1

There are two variations of the command to get all the members of a group in Linux:

grep 'Account' /etc/group
awk -F':' '/Marketing/{print $4}' /etc/group

Expected output:

[root@nakuru ~]# grep 'Account' /etc/group
Account:x:1029:AC1,AC2,AC3
[root@nakuru ~]# awk -F':' '/Marketing/{print $4}' /etc/group
Mark1,Mark2,Mark3

To enable ACLs in HDP you need to set dfs.namenode.acls.enabled to true in custom hdfs-site.xml using Ambari, which is the recommended way, and then restart all stale services (typically HDFS, MapReduce, YARN and Atlas in my case; see the attached screenshot).
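Purely for reference, this is what that property looks like once it lands in hdfs-site.xml (shown only as an illustration; in practice you set it through Ambari rather than editing the file by hand):

<property>
  <name>dfs.namenode.acls.enabled</name>
  <value>true</value>
</property>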
Answer to question 2

Task: set user "AC1" (group: "Account") to have "Read/Write/Execute" privilege in group "Marketing". This entails creating a file in HDFS owned by Mark1 (or Mark2/Mark3) with group Marketing, so as root switch to any user in the Marketing group. First create a directory in HDFS and change the ownership to Mark1 and group Marketing.

As the hdfs user, create the directory and change its ownership and permissions:

# su - hdfs
[hdfs@nakuru ~]$ hdfs dfs -mkdir -p /marketing/acldemo
[hdfs@nakuru ~]$ hdfs dfs -chown -R Mark1:marketing /marketing/acldemo

Validate that the above commands were successful:

[hdfs@nakuru ~]$ hdfs dfs -ls /marketing
Found 1 items
drwxr-xr-x   - Mark1 marketing          0 2018-01-12 21:54 /marketing/acldemo

Get the current ACL:

[hdfs@nakuru ~]$ hdfs dfs -getfacl -R /marketing/acldemo
# file: /marketing/acldemo
# owner: Mark1
# group: marketing
user::rwx
group::r-x
other::r-x

I removed the r-x for "other" just to be sure, and revalidated; note in the output below that "other" now has no r-x:

[Mark1@nakuru ~]$ hdfs dfs -chmod 750 /marketing/acldemo
[Mark1@nakuru ~]$ hdfs dfs -getfacl -R /marketing/acldemo
# file: /marketing/acldemo
# owner: Mark1
# group: marketing
user::rwx
group::r-x
other::---

Switch to user Mark1, create a local file and copy it to HDFS:

# su - Mark1
[Mark1@nakuru ~]$ echo "This is Hussain testing ACL ser "AC1" (group: "Account") will have the Right Read/Write/Execute in Group "Marketing"" > test1.txt
[Mark1@nakuru ~]$ ls -al
-rw-r--r-- 1 Mark1 Marketing 113 Jan 12 21:51 test1.txt

Copy the above file to HDFS into the previously created directory and check that it was successfully copied:

[Mark1@nakuru ~]$ hdfs dfs -put test1.txt /marketing/acldemo
[Mark1@nakuru ~]$ hdfs dfs -ls /marketing/acldemo
Found 1 items
-rw-r--r--   3 Mark1 marketing        113 2018-01-12 22:05 /marketing/acldemo/test1.txt

Testing: switched to user AC1 in group Account to see if he could read the file. It failed, and that's normal:

[root@nakuru ~]# su AC1
[AC1@nakuru root]$ hdfs dfs -cat /marketing/acldemo/test1.txt
cat: Permission denied: user=AC1, access=EXECUTE, inode="/marketing/acldemo/test1.txt":Mark1:marketing:drwxr-x-

Change the ACL for user AC1 of group Account to have rwx, as you requested:

[Mark1@nakuru ~]$ hdfs dfs -setfacl -m user:AC1:rwx /marketing/acldemo

Check the new ACL; note that user AC1 now has rwx on the file test1.txt:

[Mark1@nakuru ~]$ hdfs dfs -getfacl /marketing/acldemo/test1.txt
# file: /marketing/acldemo/test1.txt
# owner: Mark1
# group: marketing
user::rw-
user:AC1:rwx
group::r--
mask::rwx
other::r--

Switch to user AC1 and test that user AC1 can now read the file:

[root@nakuru ~]# su AC1
[AC1@nakuru root]$ hdfs dfs -cat /marketing/acldemo/test1.txt
This is Hussain testing ACL ser AC1 (group: Account) will have the Right Read/Write/Execute in Group Marketing

SUCCESS!

Answer to question 3

User Mark1 of Marketing should not be able to copy a file into the "Account" area. Create a directory and change its ownership to any user in the Account group:

[root@nakuru ~]# su - hdfs
[hdfs@nakuru ~]$ hdfs dfs -mkdir -p /Account/acldemo2
[hdfs@nakuru ~]$ hdfs dfs -chown AC1:Account /Account/acldemo2

Get the ACL of the newly created directory; note the three permission sets (other is r-x):

[root@nakuru ~]# su AC1
[AC1@nakuru root]$ hdfs dfs -getfacl /Account/acldemo2
# file: /Account/acldemo2
# owner: AC1
# group: Account
user::rwx
group::r-x
other::r-x

Test that user Mark1 can't copy a file from local to the directory /Account/acldemo2:

[root@nakuru ~]# su - Mark1
[Mark1@nakuru ~]$ hdfs dfs -put test1.txt /Account/acldemo2
put: Permission denied: user=Mark1, access=WRITE, inode="/Account/acldemo2/test1.txt._COPYING_":AC1:Account:drwxr-xr-x

The above is quite straightforward: Mark1 belongs to Marketing and doesn't have write permission on this directory. I hope that's what you meant. If that answers your question, please accept the answer by clicking the Accept button below; that would be a great help to community users looking for a quick solution to these kinds of ACL issues.
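P.S. Purely as an illustration, and not something your question asks for: if you ever did want Mark1 to be allowed to write into that directory, the same named-user ACL technique from question 2 would apply, for example:

[AC1@nakuru ~]$ hdfs dfs -setfacl -m user:Mark1:rwx /Account/acldemo2

Leaving that entry unset is exactly what keeps the Marketing users out.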
01-11-2018
04:00 PM
@Muthukumar S I have successfully reproduced what you requested: "created a different file with abiuser as owner and dfsusers as group and add ACL for the group data_team with just read permission?"

Created the file acltest2.txt as user abiuser; see its contents:

[root@nakuru ~]# su - abiuser
[abiuser@nakuru ~]$ vi acltest2.txt
Could you create different file with abiuser as owner and dfsusers as group and add ACL for the group data_team with just read permission?
Thank you.

Check the file:

[abiuser@nakuru ~]$ ls -al
-rw-r--r-- 1 abiuser dfsusers 151 Jan 11 13:00 acltest2.txt

Copied the file to HDFS:

[abiuser@nakuru ~]$ hdfs dfs -put acltest2.txt /abc/month=12

Confirmation of the file in HDFS; note the user and group:

[abiuser@nakuru ~]$ hdfs dfs -ls /abc/month=12
Found 2 items
-rw-r--r-- 3 abiuser dfsusers 151 2018-01-11 13:00 /abc/month=12/acltest2.txt
-rw-r--r-- 3 abiuser dfsusers 249 2018-01-11 12:38 /abc/month=12/file1.txt

Set the read-only ACL for group data_team, to which usera and userb belong:

[abiuser@nakuru ~]$ hdfs dfs -setfacl -m group:data_team:r-- /abc/month=12/acltest2.txt

Changed to usera:

[root@nakuru ~]# su - usera

Successfully read the file as usera:

[usera@nakuru ~]$ hdfs dfs -cat /abc/month=12/acltest2.txt
Could you create different file with abiuser as owner and dfsusers as group and add ACL for the group data_team with just read permission?
Thank you.

Now let's check the ACLs:

[usera@nakuru ~]$ hdfs dfs -getfacl -R /abc/month=12/
# file: /abc/month=12
# owner: abiuser
# group: dfsusers
user::rwx
group::r-x
other::r-x
# file: /abc/month=12/acltest2.txt
# owner: abiuser
# group: dfsusers
user::rw-
group::r--
group:data_team:r--
group:dfsusers:r--
mask::r--
other::r--
# file: /abc/month=12/file1.txt
# owner: abiuser
# group: dfsusers
user::rw-
group::r--
other::r--

Hope that answers your issue. Where exactly did you encounter the problem? Is there a step you missed? (See also the quick membership check below.) If this resolves it, please accept and close this thread.
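One quick check that often explains "ACL set but still not working" (a general suggestion, not a step from this walkthrough): confirm that the NameNode actually resolves usera and userb as members of data_team, because HDFS evaluates group membership on the NameNode host, not on the client:

[hdfs@nakuru ~]$ hdfs groups usera
[hdfs@nakuru ~]$ hdfs groups userb

If data_team does not appear in that output, a group ACL entry for data_team will never match.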
01-10-2018
10:36 PM
@Muthukumar S I have tried to reproduce your environment as below, on HDP 2.6.2 with Ambari 2.5.2; I don't think the version difference is an issue. I created the groups data_team and dfsusers and the users abiuser, usera and userb. Please try to follow the steps I used and compare with your own. I set dfs.namenode.acls.enabled to true using Ambari, which is the recommended way.

Created the groups and users:

[root@nakuru ~]# groupadd data_team
[root@nakuru ~]# useradd -G data_team usera
[root@nakuru ~]# useradd -G data_team userb
[root@nakuru ~]# groupadd dfsusers
[root@nakuru ~]# useradd abiuser

Switched to user abiuser (belonging to group dfsusers) and created a file file1.txt with the contents below:

[root@nakuru ~]# su - abiuser
[abiuser@nakuru ~]$ vi file1.txt
/*contents*/
I have enabled ACL on the ambari console and restarted the required services and I'm able to set the permissions for specific group as well. But when they try to execute it is not working. Need your suggestions. My HDP version is 2.4 and hadoop 2.7.

Checked the saved file:

[abiuser@nakuru ~]$ ls -al
-rw-r--r-- 1 abiuser abiuser 250 Jan 10 22:24 file1.txt

Enabled ACLs (custom hdfs-site.xml) through Ambari:

dfs.namenode.acls.enabled=true

Restarted all stale configs, in my case HDFS, YARN, MapReduce2 and Atlas.

As the hdfs user, created the directory and changed its ownership and permissions:

[hdfs@nakuru ~]$ hdfs dfs -mkdir -p /abc/month=12
[hdfs@nakuru ~]$ hdfs dfs -chown -R abiuser:dfsusers /abc/month=12

Validate the above:

[hdfs@nakuru ~]$ hdfs dfs -ls /abc
Found 1 items
drwxr-xr-x   - abiuser dfsusers          0 2018-01-10 22:40 /abc/month=12

Copy file1.txt from local to HDFS:

[abiuser@nakuru ~]$ hdfs dfs -put file1.txt /abc/month=12
[abiuser@nakuru ~]$ hdfs dfs -ls /abc/month=12
Found 1 items
-rw-r--r--   3 abiuser dfsusers        250 2018-01-10 22:46 /abc/month=12/file1.txt

Now see the ACLs on the file:

[abiuser@nakuru ~]$ hdfs dfs -getfacl -R /abc/month=12/
# file: /abc/month=12
# owner: abiuser
# group: dfsusers
user::rwx
group::r-x
other::r-x
# file: /abc/month=12/file1.txt
# owner: abiuser
# group: dfsusers
user::rw-
group::r--
other::r--

Now, as the file owner abiuser, set the rwx ACL for usera and userb:

[abiuser@nakuru ~]$ hdfs dfs -setfacl -m user:usera:rwx /abc/month=12/file1.txt
[abiuser@nakuru ~]$ hdfs dfs -setfacl -m user:userb:rwx /abc/month=12/file1.txt

Validate the above ACLs for file1.txt:

[abiuser@nakuru ~]$ hdfs dfs -getfacl -R /abc/month=12/
# file: /abc/month=12
# owner: abiuser
# group: dfsusers
user::rwx
group::r-x
other::r-x
# file: /abc/month=12/file1.txt
# owner: abiuser
# group: dfsusers
user::rw-
user:usera:rwx
user:userb:rwx
group::r--
mask::rwx
other::r--

See if usera can read the file:

[root@nakuru ~]# su - usera
[usera@nakuru ~]$ hdfs dfs -cat /abc/month=12/file1.txt
I have enabled ACL on the ambari console and restarted the required services and I'm able to set the permissions for specific group as well. But when they try to execute it is not working. Need your suggestions. My HDP version is 2.4 and hadoop 2.7.

I get exactly the contents of file1.txt. See if userb can read the file:

[root@nakuru ~]# su - userb
[userb@nakuru ~]$ hdfs dfs -cat /abc/month=12/file1.txt
I have enabled ACL on the ambari console and restarted the required services and I'm able to set the permissions for specific group as well. But when they try to execute it is not working. Need your suggestions. My HDP version is 2.4 and hadoop 2.7.

Voila, that answers your question: the file owner and group are abiuser:dfsusers, yet usera and userb from a different group (data_team) can successfully read file1.txt. If this answers your problem, could you accept the answer by clicking the Accept button below? That would be a great help to community users looking for a quick solution.
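One extra detail worth knowing, since it trips people up (a general note, not something from this thread): getfacl shows a mask::rwx entry that HDFS maintains automatically, and the effective permission of every named user or named group entry is limited by that mask. If the mask were tightened, for example:

[abiuser@nakuru ~]$ hdfs dfs -setfacl -m mask::r-- /abc/month=12/file1.txt

then usera and userb would effectively drop to read-only, and getfacl would flag their entries with #effective:r-- even though user:usera:rwx still appears in the listing.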
01-08-2018
08:54 AM
@Muthukumar S Please, could you explain the steps you followed? To reproduce your scenario, can you elaborate: are users A and B the same as abiuser? What is the relation between dfsusers and data_team? I have implemented numerous variations of permissions and I don't see why this shouldn't work.
01-07-2018
09:50 PM
@j van Arendonk Can you check the directory owner and group?

$ ls -al /opt/lucidworks-hdpsearch

The solr user should also show up in /etc/passwd:

$ cat /etc/passwd

Please revert.
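If the listing shows the wrong owner, a possible fix (an assumption on my part; the exact owner and group your installation expects may differ) would be:

# chown -R solr:solr /opt/lucidworks-hdpsearch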
01-06-2018
06:32 PM
@j van Arendonk Did you add the SOLR service? That should have created the solr user, which owns /opt/lucidworks-hdpsearch. Please validate that part.
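A quick way to validate both points (just a suggestion on how to check, not from the original instructions):

# id solr
# ls -ld /opt/lucidworks-hdpsearch

The first confirms the solr user exists; the second shows who owns the installation directory.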
12-22-2017
10:00 PM
@chandramouli muthukumaran As reminded by @Ashnee Sharma: if a response resolved your problem, could you accept the answer by clicking the Accept button below? That would be a great help to community users looking for a quick solution to these kinds of errors, and it also rewards the members who strive to keep this forum active.
12-22-2017
09:58 PM
@Miro Ka @Gaurav Bapat You are responding to an old thread (July 2017) and I am pretty sure members are ignoring it. It's advisable to open a new thread; could you re-post your questions there? If I get a notification I will try to help.
12-20-2017
07:30 PM
3 Kudos
@Michael Bronson Your NameNode is in safe mode. To confirm, do this as the root user:

# su - hdfs

As hdfs:

$ hdfs dfsadmin -safemode get

From the above you will confirm the "safe mode" status reported in the error, then:

$ hdfs dfsadmin -safemode leave

Then retry.
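For illustration only (your exact output may differ), the sequence typically looks like this:

$ hdfs dfsadmin -safemode get
Safe mode is ON
$ hdfs dfsadmin -safemode leave
Safe mode is OFF

If the NameNode keeps dropping back into safe mode after you leave it, it is worth checking for missing or corrupt blocks with hdfs fsck / before forcing it out again.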
12-19-2017
09:39 PM
@yothi k Can you open a new thread? The former one was closed and was specific to blobs. Please explain your situation and remember to give a description of your environment (HDP/Ambari versions, etc.). You may also include your Sqoop script and the log for the failed job.