
Adding a user for kerberized cluster


Hi, I have created a sandbox based on HDP 2.6 and kerberized it. All the tests and services are running successfully, and my root user has admin privileges for Kerberos. I was getting this error:

WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
ls: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "localhost.localdomain/127.0.0.1"; destination host is: "localhost.localdomain":8020; 

This happened when I tried the hadoop command (hadoop fs -ls /).

But when I ran the command kinit root/admin@HADOOP.COM first, I was able to execute the command successfully. Now I wanted to create a new user named 'hduser', and for that I followed these commands:

kadmin.local -q "addprinc hduser"
kinit root/admin@HADOOP.COM
su hduser

Now when I tried the 'hadoop fs -ls /' command, I got the error again. Am I missing any step? Is this not the right way to add a user in a kerberized cluster?

1 ACCEPTED SOLUTION


Hi @Rishabh Oberoi

The Kerberos principal and the OS user don't have much in common. Each OS user can authenticate as multiple Kerberos principals. The Kerberos ticket, together with the principal it was issued for, is stored in a file called the "ticket cache". You can see which principal you are authenticated as at the moment using the "klist" command. Just type "klist". In this example I am authenticated as "jimmy.page" in the Kerberos REALM "FIELD.HORTONWORKS.COM".

$ klist
Ticket cache: FILE:/tmp/krb5cc_1960402946
Default principal: jimmy.page@FIELD.HORTONWORKS.COM


Valid starting       Expires              Service principal
08/06/2017 14:47:12  08/07/2017 00:47:12  krbtgt/FIELD.HORTONWORKS.COM@FIELD.HORTONWORKS.COM
	renew until 08/13/2017 14:47:12
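
For example (just as an illustration, using the principal names from this thread), the same OS user can switch identities at any time by running kinit again; klist always shows whoever authenticated last:

$ kinit root/admin@HADOOP.COM
$ klist | grep "Default principal"
Default principal: root/admin@HADOOP.COM
$ kinit hduser@HADOOP.COM
$ klist | grep "Default principal"
Default principal: hduser@HADOOP.COM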

Without kinit you shouldn't have a ticket in the ticket cache and should therefore see something like

$ klist
klist: No credentials cache found (filename: /tmp/krb5cc_1960402946)

Before you run any "hadoop" or "hdfs" commands, you should check with klist whether you are authenticated, and whether you are authenticated as the user you want to be.
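
A small sketch of that check: klist -s is MIT Kerberos' "silent" mode, which prints nothing and only sets the exit status depending on whether the cache holds a valid, non-expired ticket.

if klist -s; then
    hadoop fs -ls /
else
    echo "No valid Kerberos ticket - run kinit first"
fi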

Thus, independently of which OS user you are, you can authenticate as hduser by simply running

kinit hduser

You will be prompted for hduser's password.

Now you should be able to use HDFS as hduser.
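
Putting the pieces together for your case, the whole flow would look roughly like this (realm and principal names taken from your post):

kadmin.local -q "addprinc hduser"    # create the principal (run as root / a KDC admin)
su - hduser                          # switch to the OS user you want to work as
kinit hduser@HADOOP.COM              # authenticate; you will be asked for hduser's password
klist                                # the ticket cache should now show hduser@HADOOP.COM
hadoop fs -ls /                      # HDFS now sees you as hduser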

Note 1: Be prepared that you will not have any permission to create directories or write data unless you grant those permissions via HDFS's own POSIX-style permissions or a corresponding policy in Apache Ranger.
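
For example, to give hduser a place to write to, you would typically create a home directory for it in HDFS while authenticated as the hdfs superuser (the path and the "hadoop" group below are just the usual convention, adjust to your cluster):

hdfs dfs -mkdir -p /user/hduser              # run as the hdfs superuser
hdfs dfs -chown hduser:hadoop /user/hduser   # hand the directory over to hduser
hdfs dfs -chmod 750 /user/hduser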

Note 2: If you use keytabs instead of passwords (and for the sake of clarity), it makes sense to create an OS user AND a Kerberos principal with the same name, and to restrict permissions on the keytab file so that only that OS user can read it.
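
A minimal sketch of that keytab setup, assuming the keytab lives in hduser's home directory (the file location is just an example):

kadmin.local -q "ktadd -norandkey -k /home/hduser/hduser.keytab hduser"   # -norandkey (kadmin.local only) keeps the existing password valid
chown hduser:hduser /home/hduser/hduser.keytab
chmod 400 /home/hduser/hduser.keytab          # only hduser can read the keytab
su - hduser
kinit -kt /home/hduser/hduser.keytab hduser   # authenticate without a password prompt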


3 REPLIES


Hi, I added the hduser principal and ran kinit hduser. Then I switched to hduser and ran the command hadoop fs -ls /, but I still got the same error.


Problem solved. I just had to kinit as hduser while already logged in as hduser.
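
For anyone who finds this later: the ticket cache is per OS user (by default /tmp/krb5cc_<uid>), so the kinit has to happen after switching users. The working order is:

su - hduser
kinit hduser       # prompts for hduser's Kerberos password
klist              # should now show hduser@HADOOP.COM as the default principal
hadoop fs -ls /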