Support Questions


User Management Question

Expert Contributor

I am trying to explore the best way to manage users in the Hadoop ecosystem. I plan to provide the following interfaces to the user community:

a.) Edge node - a Linux machine where users log in with their Linux credentials and use the command line to run Hadoop clients (Spark, Sqoop, HDFS, etc.)

b.) Ambari web interface

c.) HUE interface

d.) Ranger - for admins to control file and folder permissions

The question I have is: is it possible to create accounts in the Linux environment and have all the other interfaces pull from there, so users can use the same credentials everywhere? I have read about LDAP, but it appears difficult to set up and we don't currently have a working LDAP server.

How can I centrally manage users without using LDAP?

Thanks

Prakash

1 ACCEPTED SOLUTION

Super Guru

If you don't use LDAP, then the Linux accounts you use must be present on each node in the cluster, with consistent permissions (you can relax this requirement if you are using Ranger and disable POSIX permission checks by setting dfs.permissions.enabled = false).

But without LDAP you need to have Linux accounts on all machines.
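
As a rough sketch of what that looks like in practice (the host names, username, and UID below are placeholders, not anything from this thread), you could create the same account with the same UID on every node:

# Placeholder host list and account; replace with your actual cluster hosts.
NODES="master1 worker1 worker2 edge1"
USER_NAME="analyst1"
USER_UID=1501   # keep the UID identical on every node

for host in $NODES; do
  # create the account only if it does not already exist on that host
  ssh root@"$host" "id -u $USER_NAME >/dev/null 2>&1 || useradd -u $USER_UID $USER_NAME"
done

# Alternatively, if Ranger handles authorization, HDFS POSIX checks can be
# relaxed by setting dfs.permissions.enabled=false in hdfs-site.xml, as noted above.

Keeping the UID consistent across nodes avoids ownership mismatches on local directories that jobs may write to.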


6 REPLIES


Expert Contributor

What about the web interfaces, Ambari and HUE? Can user accounts in Ambari and HUE be synced with the Linux accounts?

Also, why do I need a Linux account on all members of the cluster? I could just give access to one machine to be used as the edge node.

Thanks

Super Guru

The user needs to exist on whichever machines you are reading file blocks from, based on POSIX permissions. As I said, you might not need those accounts if you are using Ranger and/or set dfs.permissions.enabled = false in hdfs-site.xml. When you run a Hive query from HUE, it runs as the hive user, not as the HUE user. You want to make sure you have a user named "hive" on the host where HiveServer2 runs. Then you enable Hive impersonation to decide who gets what access; a rough sketch follows the links below.

https://docs.hortonworks.com/HDPDocuments/Ambari-2.1.0.0/bk_ambari_views_guide/content/_configuring_...

http://hortonworks.com/blog/best-practices-for-hive-authorization-using-apache-ranger-in-hdp-2-2/
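
As a minimal sketch of the check described above (the host name is a placeholder), you could verify that a hive account exists on the HiveServer2 host; the impersonation part is configuration rather than a shell command, so it is only noted in comments:

# Placeholder host name; point this at your actual HiveServer2 host.
ssh root@hiveserver2-host "id hive || useradd hive"

# Impersonation itself is configured, not scripted:
#   hive.server2.enable.doAs=true            (hive-site.xml)
#   hadoop.proxyuser.hive.hosts / .groups    (core-site.xml)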

Ambari is just a management tool, so you can have Ambari accounts for the people who need access to Ambari, and these are independent of the cluster accounts. See the Ambari documentation link above for creating "local" users in Ambari.
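
If it helps, local Ambari users can also be created through Ambari's REST API. A rough example follows; the host, credentials, and exact payload fields are placeholders and may differ by Ambari version:

# Placeholder Ambari URL and admin credentials.
curl -u admin:admin -i \
     -H "X-Requested-By: ambari" \
     -X POST \
     -d '{"Users/user_name":"analyst1","Users/password":"ChangeMe123","Users/active":true,"Users/admin":false}' \
     http://ambari-host:8080/api/v1/users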

avatar

For user management, use LDAP. It is easier to manage one LDAP server; otherwise you have to create users on every machine. So rather than creating users on every machine, use LDAP and configure every machine as an LDAP client.
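
As a minimal sketch of pointing a node at an LDAP server (assuming a RHEL/CentOS-style node; the server URL and base DN below are placeholders):

# Run on each cluster node; values are placeholders for your environment.
authconfig --enableldap --enableldapauth \
           --ldapserver=ldap://ldap.example.com \
           --ldapbasedn="dc=example,dc=com" \
           --enablemkhomedir --update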

Expert Contributor

Using LDAP is the plan, but making it work doesn't seem to be simple.

Thanks

Prakash

avatar

Configuring LDAP is easy. If you agree with the solution, let's close this thread.