Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.
Announcements
This board is archived and read-only for historical reference. To ask a new question, please post a new topic on the appropriate active board.

Create new hive user

New Member

I want to create a new user for Hive. When I installed Hive using Ambari, a default hive user was created, but I want to create one more user. I know it has to be done through HDFS, but can someone give me the exact commands to run and tell me where I should create the user? The Hive documentation doesn't cover this.

1 ACCEPTED SOLUTION

Master Guru

By itself, any HDFS user can use Hive.

If you do not have Kerberos:

1) Add the user to Linux with useradd (or add it to your LDAP).

2) Create the user's HDFS home directory. As the hdfs user, run:

hadoop fs -mkdir /user/myuser

hadoop fs -chown myuser /user/myuser

And that's it: you should now be able to run queries as that user.
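A hedged end-to-end sketch of the two steps above (myuser is a placeholder, and the sudo targets assume a typical Ambari-managed node):

```shell
# 1) Create the Linux account (or add it to your LDAP instead).
sudo useradd myuser

# 2) As the hdfs superuser, create the HDFS home directory and hand it over.
sudo -u hdfs hadoop fs -mkdir -p /user/myuser
sudo -u hdfs hadoop fs -chown myuser:myuser /user/myuser

# Quick check: run a trivial query as the new user.
sudo -u myuser hive -e "SHOW DATABASES;"
```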

There is a second part for secure clusters.

If you have Kerberos:

1) Add the user to your KDC.

2) kinit as the user.

3) Run hive.
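The three Kerberos steps can be sketched like this, assuming an MIT KDC and a placeholder realm EXAMPLE.COM:

```shell
# 1) Add the user's principal to the KDC (MIT kadmin syntax; the admin principal is an assumption).
kadmin -p admin/admin -q "addprinc myuser@EXAMPLE.COM"

# 2) Get a ticket as the user.
kinit myuser@EXAMPLE.COM

# 3) Run hive with the ticket in the credential cache.
hive -e "SHOW DATABASES;"
```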

Using HiveServer2 with Beeline or the JDBC driver:

It depends on which security you have configured for HiveServer2:

- None: just specify myuser as the user (-n myuser)

- PAM: uses Linux accounts, so give -n myuser -p myuserlinuxpassword

- LDAP: uses an LDAP server; let's assume it's the same as your Linux account

- Kerberos: needs a kinit, and the principal is specified in the JDBC URL
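Hedged Beeline examples for each mode (host, port, and the HiveServer2 principal are placeholders; 10000 is the default HiveServer2 port):

```shell
# None: user name only
beeline -u jdbc:hive2://hs2.example.com:10000 -n myuser

# PAM or LDAP: user name plus password
beeline -u jdbc:hive2://hs2.example.com:10000 -n myuser -p myuserlinuxpassword

# Kerberos: kinit first, then put the HiveServer2 principal in the JDBC URL
kinit myuser@EXAMPLE.COM
beeline -u "jdbc:hive2://hs2.example.com:10000/default;principal=hive/hs2.example.com@EXAMPLE.COM"
```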

Finally, your user needs access to the tables:

No authorization:

- Make sure the tables you want to read are readable by that user, or enable doAs=false (the hive.server2.enable.doAs property in hive-site.xml).

Ranger:

- Add access to the table in the Ranger portal.

SQLStdAuth:

- Grant access to the table using the GRANT command.
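For the SQLStdAuth case, a sketch of the GRANT run through Beeline as an admin user (default.mytable, hiveadmin, and the host are placeholders):

```shell
# Grant SELECT on one table to the new user under SQL standard authorization.
beeline -u jdbc:hive2://hs2.example.com:10000 -n hiveadmin \
  -e "GRANT SELECT ON TABLE default.mytable TO USER myuser;"
```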


3 REPLIES


Expert Contributor
@Benjamin Leonhardi

If I use SQL authentication with this method, how should I assign passwords to users, for use in a DB connection for instance?

Will Hive consider OS-level user passwords? If so, should I also set a password for the 'hive' user? Does that affect other operations?

New Member

@bleonhardi, How can I connect to Hive with a UNIX/HDFS user?

Let's say I have a user named "newuser". I tried the following, but apparently I am doing something wrong.

% su - newuser
% beeline
beeline> !connect -n newuser -p jdbc:mysql://my.host.com/hive
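For what it's worth, the !connect line above points at a MySQL URL (likely the metastore database) rather than HiveServer2, and !connect takes the URL first. A corrected sketch, assuming HiveServer2 on my.host.com at the default port 10000:

```shell
su - newuser
beeline
# Inside the beeline shell, !connect takes url, user, password in that order:
# beeline> !connect jdbc:hive2://my.host.com:10000 newuser newuserpassword
```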