Member since: 06-07-2016
Posts: 81
Kudos Received: 3
Solutions: 5
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1431 | 02-21-2018 07:54 AM
 | 3680 | 02-21-2018 07:52 AM
 | 4690 | 02-14-2018 09:30 AM
 | 1994 | 10-13-2016 04:18 AM
 | 12320 | 10-11-2016 08:26 AM
01-16-2018 02:24 AM

@Geoffrey Shelton Okot Thanks, I will close the thread. Yes, the steps have been verified multiple times and we still end up with that error. We have not subscribed even to Hortonworks basic support, and because of this risk we have not upgraded: if we get stuck on an issue, there is no one to help. The client is aware of this.
01-15-2018 03:14 PM

@Geoffrey Shelton Okot Sorry, I was stuck on a few issues and missed replying. Yes, I have followed all the steps you mentioned. I was getting the error shown in my first post, which is why I started this thread, and you have suggested the same steps I had already followed. Not sure what is wrong, or whether it is a bug? This happens when running as user A or B, who are part of data_team in the ACL. 😞

$ hadoop fs -ls /abc/month=12
ls: Permission denied: user=A, access=EXECUTE, inode="/abc/month=12":abiuser:dfsusers:drwxrwx---
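A hedged diagnostic note: ls needs EXECUTE (x) on a directory to traverse it, so the denial above is consistent with an effective entry of r-- for data_team on the directory itself. One way to see how the ACL resolves for the directory (same path as above):

# Show the directory's access entries; getfacl appends
# "#effective:..." to any entry that the mask has filtered down.
hdfs dfs -getfacl /abc/month=12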
01-11-2018 02:31 AM

@Sandeep Kumar Yes, I have already referred to those documents and set things up as required. The problem is that the user is not allowed to read a file that has the proper permission in the ACL. You may go through my initial postings with the steps. Thank you
01-11-2018 02:29 AM

@Geoffrey Shelton Okot First of all, thanks for your time and the outputs. The same thing has been done, with only one difference: I have given the ACL permission to the group data_team with r-x instead of to individual users. In the future there will be a requirement for other users to get read-only access, which I can handle by just adding them to the data_team group in Linux. I hope this will also work. Below is the command I have used.

hdfs dfs -setfacl -m -R group:data_team:r-x /abc/month=12

Could you create a different file with abiuser as owner and dfsusers as group, and add an ACL for the group data_team with just read permission? Thank you.
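One thing worth verifying after a recursive setfacl (a suggestion on my part, not from the thread): the mask entry caps the effective permission of every named group entry, so an r-x entry only takes effect if the mask also allows x. A minimal sketch that sets both in one call, using the same path:

# aclSpec takes a comma-separated list; pinning mask::r-x explicitly
# prevents the mask from capping the named entry at r--.
hdfs dfs -setfacl -m -R group:data_team:r-x,mask::r-x /abc/month=12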
01-10-2018 09:27 AM

@Geoffrey Shelton Okot

1. The ACL feature is enabled by adding the entry below to the custom hdfs-site.xml and restarting the required services from the Ambari console.

<property>
  <name>dfs.namenode.acls.enabled</name>
  <value>true</value>
</property>

2. A and B are sample users, and they have been added to the group data_team (at the Linux level); they are not abiuser. abiuser is the owner of the file, and dfsusers is the group of that file (/abc/month=12/file1.bcsf). The ACL permission was added for the group data_team using the command below.

hdfs dfs -setfacl -m -R group:data_team:r-x /abc/month=12/

3. With the above setup done, users A and B are still not able to read or access the files for which the ACL permission has been given.
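Two quick checks that may help at this point (a hedged sketch; these are standard HDFS CLI commands, with users and paths taken from the thread). Note that HDFS resolves group membership on the NameNode host, not on the client, so a Linux group added only on client machines would not be seen:

# Confirm the NameNode picked up the ACL flag after the restart.
hdfs getconf -confKey dfs.namenode.acls.enabled

# Confirm HDFS maps users A and B to data_team (the lookup happens
# where the NameNode runs).
hdfs groups A B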
01-08-2018 05:11 AM

Dear all,

I have enabled ACLs on the Ambari console and restarted the required services, and I am able to set permissions for a specific group as well. But when the users try to access the data, it is not working. I need your suggestions. My HDP version is 2.4 and Hadoop is 2.7.

The getfacl output for the folder and file is:

$ hdfs dfs -getfacl -R /abc/month=12/
# file: /abc/month=12
# owner: abiuser
# group: dfsusers
user::rwx
group::r-x
group:data_team:r--
mask::r-x
other::---
default:user::rwx
default:group::r-x
default:group:data_team:r-x
default:mask::r-x
default:other::---

# file: /abc/month=12/file1.bcsf
# owner: abiuser
# group: dfsusers
user::rwx
group::r--
group:data_team:r--
mask::r--
other::---

Users A and B are part of data_team; when they try to read the file we get the error below.

$ hadoop fs -ls /abc/month=12
ls: Permission denied: user=A, access=EXECUTE, inode="/abc/month=12":abiuser:dfsusers:drwxrwx---

I would appreciate any suggestion or help. Thank you
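A hedged reading of the output above: the r-x entry for data_team appears only among the default: entries, which govern newly created children, while the directory's own access entry is group:data_team:r-- with no x. Since ls requires EXECUTE on the directory, the denial matches the posted ACL. A minimal sketch of the adjustment this suggests (not a confirmed fix):

# default: entries don't apply to the directory itself; grant x on
# the access entry so data_team members can traverse and list it.
hdfs dfs -setfacl -m group:data_team:r-x /abc/month=12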
11-21-2016 03:15 AM

@Kuldeep Kulkarni Thanks for your comment. We are using HDP 2.4 and Hadoop 2.7.2, and I am not sure this feature is there in that version. I enabled read-only permission on the .Trash folder for the users. I also had snapshots enabled for a directory, which is not protective enough, since we are still able to delete the folders inside a snapshottable directory. Maybe an upgrade would be the last option to explore. Thank you very much for the comments.
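For reference, a hedged pointer (my assumption about the feature referred to above, not confirmed in the thread): Hadoop 2.8+ supports fs.protected.directories in core-site.xml, which makes the listed directories undeletable while they are non-empty, even for the owner. A minimal sketch:

<property>
  <name>fs.protected.directories</name>
  <!-- Comma-separated list of paths; deletes fail while a listed
       directory is non-empty, regardless of the caller's permissions. -->
  <value>/app/user1</value>
</property>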
11-18-2016 03:00 AM

Very nice article. If you have a step-by-step procedure with prerequisites, could you please forward it to me (muthukumar.siva@gmail.com)? I would like to implement this in my environment. Thank you in advance.
11-16-2016 07:21 AM
1 Kudo
Hi All,

Is there a way to prevent an HDFS directory owner from running the hadoop rm command? For example, user1 is the owner of the HDFS directory /app/user1/ and has other folders and files under it. If he runs hadoop fs -rm -R (or just -rm) /app/user1, it should fail. Is there a better way to prevent this?
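One mitigation, mentioned in the 11-21-2016 reply above, is snapshots: they do not block the delete itself, but they preserve the data so it can be recovered. A minimal sketch (the snapshot name is a placeholder, and the first command needs HDFS admin rights):

# Admin: mark the directory snapshottable.
hdfs dfsadmin -allowSnapshot /app/user1

# Owner or superuser: take a point-in-time snapshot. Deleted
# contents stay readable under /app/user1/.snapshot/before_cleanup.
hdfs dfs -createSnapshot /app/user1 before_cleanup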
Labels:
- Apache Hadoop
10-13-2016 06:41 AM

@slachterman The above is for AWS instances, as we have been passing credentials with the command. For an on-prem setup I would need to check. One thing I know is that when we set up on-prem servers with the AWS CLI installed, we can run the aws configure command to provide the credentials once, and from then on we can run aws s3 commands from the command line to access AWS S3 (provided things have been set up on the AWS end, like IAM user creation, bucket policy, etc.). But with hadoop distcp, the one you provided is the solution. Maybe we can check with the AWS team whether there is a role-based option from on-prem.
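For context, a minimal sketch of the credential-based distcp pattern referred to above (the bucket name, keys, and paths are placeholders; fs.s3a.access.key and fs.s3a.secret.key are the standard Hadoop S3A credential properties):

# Copy an HDFS path to S3 using inline S3A credentials.
hadoop distcp \
  -Dfs.s3a.access.key=AKIA_EXAMPLE \
  -Dfs.s3a.secret.key=EXAMPLE_SECRET \
  hdfs:///abc/month=12 \
  s3a://example-bucket/abc/month=12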