Created 01-20-2017 11:59 AM
I would like to create a user, put a Ranger policy on it that restricts an HDFS directory, and then test it in the Files View in Ambari.
Do I need to create the user in both Ambari and Ranger with the same name and password?
I am unable to "su" to the new user (created via Ambari) at the Unix command prompt. I am using HDP 2.5.
Created 01-20-2017 12:38 PM
@Dinesh Das In Ambari you can add users and groups manually - click on the 'admin' button at the top right, select 'Manage Ambari' and then click on either Users or Groups and then the 'Create Local User/Group' button. These users only exist in Ambari, not in the OS or in Ranger.
Alternatively you can configure Ambari to pull users and groups from LDAP/Active Directory - see Configuring Ambari for LDAP or Active Directory Authentication
If you want to be able to 'su' to the user in the OS then you'll need to configure your OS to also read the users from LDAP/Active Directory or manually add them to your OS using 'adduser' and 'addgroup'.
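For example, a minimal sketch of adding a local OS user and group ('testuser' and 'testgroup' are placeholder names; on the CentOS-based sandbox the underlying commands are useradd/groupadd):

# create a local group and user, and set a password so you can 'su' to it
groupadd testgroup
useradd -g testgroup testuser
passwd testuser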
Ranger can synchronize your users and groups either from the OS or from LDAP/Active Directory - see Advanced Usersync Settings.
The best choice is to sync all three - OS, Ambari and Ranger - from LDAP/Active Directory. That way you ensure that all users and groups exist in all three components.
Created 01-20-2017 01:06 PM
@Terry Stebbens Thank you so much for the detailed explanation. I am using the HDP 2.5 sandbox, where LDAP is not set up with Ambari or the OS. I am able to sync between Ranger and Ambari. Is there anything I can do to tweak the sandbox 2.5 configuration to use LDAP?
Created on 01-20-2017 07:24 PM - edited 08-19-2019 01:40 AM
@Dinesh Das The HDP Sandbox 2.5 comes with Knox, which includes a demo LDAP server that should be sufficient for testing purposes. You can start and stop this from Ambari under Knox > Service Actions.
In the Knox configuration is a section called 'Advanced users-ldif' which contains the LDIF data loaded by the demo LDAP server. You can add users and groups to this LDIF, save the configuration and then restart the demo LDAP server. If you're not familiar with LDIF then the template to add a user is something like:
dn: uid=<username>,ou=people,dc=hadoop,dc=apache,dc=org
objectclass: top
objectclass: person
objectclass: organizationalPerson
objectclass: inetOrgPerson
cn: <common name, e.g. Joe Bloggs>
sn: <surname, e.g. Bloggs>
uid: <username>
userPassword: <password>
Replace <username> with the username you want to add, <common name, e.g. Joe Bloggs> with the full name of the user, <surname, e.g. Bloggs> with the surname of the user, and <password> with the password you want.
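For instance, a hypothetical entry for a user 'sales1' (placeholder name and password, not a sandbox default) would be:

dn: uid=sales1,ou=people,dc=hadoop,dc=apache,dc=org
objectclass: top
objectclass: person
objectclass: organizationalPerson
objectclass: inetOrgPerson
cn: Sales One
sn: One
uid: sales1
userPassword: sales1-password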
Similarly for groups, use the template
dn: cn=<groupname>,ou=groups,dc=hadoop,dc=apache,dc=org
objectclass: top
objectclass: groupofnames
cn: <groupname>
member: uid=<username>,ou=people,dc=hadoop,dc=apache,dc=org
Replace <groupname> with the group name you want, and add as many member: lines as you need to include users in the group, e.g.
member: uid=user_a,ou=people,dc=hadoop,dc=apache,dc=org
member: uid=user_b,ou=people,dc=hadoop,dc=apache,dc=org
member: uid=user_c,ou=people,dc=hadoop,dc=apache,dc=org
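Putting it together, a hypothetical 'analysts' group containing user_a and user_b (placeholder names) would look like:

dn: cn=analysts,ou=groups,dc=hadoop,dc=apache,dc=org
objectclass: top
objectclass: groupofnames
cn: analysts
member: uid=user_a,ou=people,dc=hadoop,dc=apache,dc=org
member: uid=user_b,ou=people,dc=hadoop,dc=apache,dc=org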
Configuring your OS to read these users and groups from the demo LDAP server is quite complex: you'd need a lot more information in the LDIF file to support this, and you'd also have to configure PAM/NSS to talk to the LDAP server. For your purposes I'd stick to using 'adduser' and 'addgroup' to add all the users and groups you want to the OS manually.
Once you've added the users and groups you want and started the demo LDAP you can use the instructions here to connect Ambari up with the demo LDAP server: https://community.hortonworks.com/questions/2838/has-anyone-integrated-for-demo-purposes-only-the-k....
For Ranger, you should also leave it syncing users from the OS (the default configuration): since you will have used 'adduser' and 'addgroup' to add all the users to the OS, Ranger will automatically sync these for you. If you really want to sync the users from the demo LDAP server, then you'll need to set the following properties for Ranger Admin and Ranger Usersync. Note that I haven't tried this so it may not work and you may need to experiment with some of the settings.
Ranger:
ranger.ldap.base.dn=dc=hadoop,dc=apache,dc=org
ranger.ldap.bind.dn=uid=admin,ou=people,dc=hadoop,dc=apache,dc=org
ranger.ldap.bind.password=admin-password
ranger.ldap.group.roleattribute=cn
ranger.ldap.group.searchbase=ou=groups,dc=hadoop,dc=apache,dc=org
ranger.ldap.group.searchfilter=(member=uid={0},ou=people,dc=hadoop,dc=apache,dc=org)
ranger.ldap.referral=follow
ranger.ldap.url=ldap://localhost:33389
ranger.ldap.user.dnpattern=uid={0},ou=people,dc=hadoop,dc=apache,dc=org
ranger.ldap.user.searchfilter=(uid={0})
UserSync:
ranger.usersync.group.memberattributename=member
ranger.usersync.group.nameattribute=cn
ranger.usersync.group.objectclass=groupofnames
ranger.usersync.group.search.first.enabled=false
ranger.usersync.group.searchbase=ou=groups,dc=hadoop,dc=apache,dc=org
ranger.usersync.group.searchenabled=true
ranger.usersync.group.searchfilter=
ranger.usersync.group.searchscope=sub
ranger.usersync.group.usermapsyncenabled=true
ranger.usersync.ldap.binddn=uid=admin,ou=people,dc=hadoop,dc=apache,dc=org
ranger.usersync.ldap.groupname.caseconversion=none
ranger.usersync.ldap.ldapbindpassword=admin-password
ranger.usersync.ldap.referral=follow
ranger.usersync.ldap.searchBase=dc=hadoop,dc=apache,dc=org
ranger.usersync.ldap.url=ldap://localhost:33389
ranger.usersync.ldap.user.groupnameattribute=memberof,ismemberof
ranger.usersync.ldap.user.nameattribute=uid
ranger.usersync.ldap.user.objectclass=person
ranger.usersync.ldap.user.searchbase=ou=people,dc=hadoop,dc=apache,dc=org
ranger.usersync.ldap.user.searchfilter=
ranger.usersync.ldap.user.searchscope=sub
ranger.usersync.ldap.username.caseconversion=none
Created 01-24-2017 02:29 PM
Thanks a lot again for all your help.
While testing Ranger functionality in the HDP 2.5 sandbox, I found that I need access as the hive user, but I am not sure what the default hive user credentials are. Can you help me with this please?
I am trying to achieve the following (from the blog post linked below):
Having a federated authorization model may create a challenge for security administrators looking to plan a security model for HDFS.
After Apache Ranger and Hadoop have been installed, we recommend administrators to implement the following steps:
Identify the directories that can be managed by Ranger policies
We recommend that permission for application data folders (/apps/hive, /apps/Hbase) as well as any custom data folders be managed through Apache Ranger. The HDFS native permissions for these directories need to be restrictive. This can be done through changing permissions in HDFS using chmod.
Example:
$ hdfs dfs -chmod -R 000 /apps/hive
$ hdfs dfs -chown -R hdfs:hdfs /apps/hive
$ hdfs dfs -ls /apps/hive
Found 1 items
d---------   - hdfs hdfs          0 2015-11-30 08:01 /apps/hive/warehouse
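(Note: since the directory is owned by hdfs, these chmod/chown commands normally have to be run as the HDFS superuser rather than root; for example, a sketch assuming root can sudo to the sandbox's default hdfs account:)

sudo -u hdfs hdfs dfs -chmod -R 000 /apps/hive
sudo -u hdfs hdfs dfs -chown -R hdfs:hdfs /apps/hive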
After changing the umask value to 077, it does not allow me to do any operation as the root user; it is looking for the hive/hdfs owner instead. Can you guide me please?
Error:
    at org.apache.hadoop.fs.shell.Command.run(Command.java:165)
    at org.apache.hadoop.fs.FsShell.run(FsShell.java:297)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
    at org.apache.hadoop.fs.FsShell.main(FsShell.java:350)
chmod: changing permissions of '/apps/hive': Permission denied. user=root is not the owner of inode=hive
http://hortonworks.com/blog/best-practices-in-hdfs-authorization-with-apache-ranger/