Created on 08-22-2019 03:06 AM - last edited on 08-22-2019 11:42 AM by cjervis
Hi,
We are unable to access HDFS as a particular user, 'edtuser', from PuTTY after enabling Kerberos, but we can access HDFS as the 'root' and 'hdfs' users. When we try to access HDFS as 'edtuser', we get the following error:
[edtuser@nladfmrvu11 ~]$
[edtuser@nladfmrvu11 ~]$ hadoop fs -ls /
19/08/22 09:48:06 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
19/08/22 09:48:06 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
19/08/22 09:48:06 INFO retry.RetryInvocationHandler: java.io.IOException: DestHost:destPort nladfmrvu12.mtn.co.za:8020 , LocalHost:localPort nladfmrvu11.mtn.co.za/10.244.8.52:0. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS], while invoking ClientNamenodeProtocolTranslatorPB.getFileInfo over nladfmrvu12.mtn.co.za/10.244.8.53:8020 after 1 failover attempts. Trying to failover after sleeping for 610ms.
19/08/22 09:48:06 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
19/08/22 09:48:06 INFO retry.RetryInvocationHandler: java.io.IOException: DestHost:destPort nladfmrvu11.mtn.co.za:8020 , LocalHost:localPort nladfmrvu11.mtn.co.za/10.244.8.52:0. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS], while invoking ClientNamenodeProtocolTranslatorPB.getFileInfo over nladfmrvu11.mtn.co.za/10.244.8.52:8020 after 2 failover attempts. Trying to failover after sleeping for 1680ms.
19/08/22 09:48:08 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
19/08/22 09:48:08 INFO retry.RetryInvocationHandler: java.io.IOException: DestHost:destPort nladfmrvu12.mtn.co.za:8020 , LocalHost:localPort nladfmrvu11.mtn.co.za/10.244.8.52:0. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS], while invoking ClientNamenodeProtocolTranslatorPB.getFileInfo over nladfmrvu12.mtn.co.za/10.244.8.53:8020 after 3 failover attempts. Trying to failover after sleeping for 5206ms.
19/08/22 09:48:13 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
19/08/22 09:48:13 INFO retry.RetryInvocationHandler: java.io.IOException: DestHost:destPort nladfmrvu11.mtn.co.za:8020 , LocalHost:localPort nladfmrvu11.mtn.co.za/10.244.8.52:0. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS], while invoking ClientNamenodeProtocolTranslatorPB.getFileInfo over nladfmrvu11.mtn.co.za/10.244.8.52:8020 after 4 failover attempts. Trying to failover after sleeping for 10481ms.
19/08/22 09:48:24 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
19/08/22 09:48:24 INFO retry.RetryInvocationHandler: java.io.IOException: DestHost:destPort nladfmrvu12.mtn.co.za:8020 , LocalHost:localPort nladfmrvu11.mtn.co.za/10.244.8.52:0. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS], while invoking ClientNamenodeProtocolTranslatorPB.getFileInfo over nladfmrvu12.mtn.co.za/10.244.8.53:8020 after 5 failover attempts. Trying to failover after sleeping for 8347ms.
19/08/22 09:48:32 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
19/08/22 09:48:32 INFO retry.RetryInvocationHandler: java.io.IOException: DestHost:destPort nladfmrvu11.mtn.co.za:8020 , LocalHost:localPort nladfmrvu11.mtn.co.za/10.244.8.52:0. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS], while invoking ClientNamenodeProtocolTranslatorPB.getFileInfo over nladfmrvu11.mtn.co.za/10.244.8.52:8020 after 6 failover attempts. Trying to failover after sleeping for 13318ms.
19/08/22 09:48:45 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
19/08/22 09:48:45 INFO retry.RetryInvocationHandler: java.io.IOException: DestHost:destPort nladfmrvu12.mtn.co.za:8020 , LocalHost:localPort nladfmrvu11.mtn.co.za/10.244.8.52:0. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS], while invoking ClientNamenodeProtocolTranslatorPB.getFileInfo over nladfmrvu12.mtn.co.za/10.244.8.53:8020 after 7 failover attempts. Trying to failover after sleeping for 14341ms.
19/08/22 09:49:00 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
19/08/22 09:49:00 INFO retry.RetryInvocationHandler: java.io.IOException: DestHost:destPort nladfmrvu11.mtn.co.za:8020 , LocalHost:localPort nladfmrvu11.mtn.co.za/10.244.8.52:0. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS], while invoking ClientNamenodeProtocolTranslatorPB.getFileInfo over nladfmrvu11.mtn.co.za/10.244.8.52:8020 after 8 failover attempts. Trying to failover after sleeping for 17707ms.
19/08/22 09:49:17 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
19/08/22 09:49:17 INFO retry.RetryInvocationHandler: java.io.IOException: DestHost:destPort nladfmrvu12.mtn.co.za:8020 , LocalHost:localPort nladfmrvu11.mtn.co.za/10.244.8.52:0. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS], while invoking ClientNamenodeProtocolTranslatorPB.getFileInfo over nladfmrvu12.mtn.co.za/10.244.8.53:8020 after 9 failover attempts. Trying to failover after sleeping for 12086ms.
19/08/22 09:49:29 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
19/08/22 09:49:29 INFO retry.RetryInvocationHandler: java.io.IOException: DestHost:destPort nladfmrvu11.mtn.co.za:8020 , LocalHost:localPort nladfmrvu11.mtn.co.za/10.244.8.52:0. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS], while invoking ClientNamenodeProtocolTranslatorPB.getFileInfo over nladfmrvu11.mtn.co.za/10.244.8.52:8020 after 10 failover attempts. Trying to failover after sleeping for 11451ms.
19/08/22 09:49:41 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
19/08/22 09:49:41 INFO retry.RetryInvocationHandler: java.io.IOException: DestHost:destPort nladfmrvu12.mtn.co.za:8020 , LocalHost:localPort nladfmrvu11.mtn.co.za/10.244.8.52:0. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS], while invoking ClientNamenodeProtocolTranslatorPB.getFileInfo over nladfmrvu12.mtn.co.za/10.244.8.53:8020 after 11 failover attempts. Trying to failover after sleeping for 22482ms.
19/08/22 09:50:03 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
19/08/22 09:50:03 INFO retry.RetryInvocationHandler: java.io.IOException: DestHost:destPort nladfmrvu11.mtn.co.za:8020 , LocalHost:localPort nladfmrvu11.mtn.co.za/10.244.8.52:0. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS], while invoking ClientNamenodeProtocolTranslatorPB.getFileInfo over nladfmrvu11.mtn.co.za/10.244.8.52:8020 after 12 failover attempts. Trying to failover after sleeping for 22157ms.
19/08/22 09:50:25 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
19/08/22 09:50:25 INFO retry.RetryInvocationHandler: java.io.IOException: DestHost:destPort nladfmrvu12.mtn.co.za:8020 , LocalHost:localPort nladfmrvu11.mtn.co.za/10.244.8.52:0. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS], while invoking ClientNamenodeProtocolTranslatorPB.getFileInfo over nladfmrvu12.mtn.co.za/10.244.8.53:8020 after 13 failover attempts. Trying to failover after sleeping for 20978ms.
19/08/22 09:50:46 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
19/08/22 09:50:46 INFO retry.RetryInvocationHandler: java.io.IOException: DestHost:destPort nladfmrvu11.mtn.co.za:8020 , LocalHost:localPort nladfmrvu11.mtn.co.za/10.244.8.52:0. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS], while invoking ClientNamenodeProtocolTranslatorPB.getFileInfo over nladfmrvu11.mtn.co.za/10.244.8.52:8020 after 14 failover attempts. Trying to failover after sleeping for 16712ms.
^Z
[1]+ Stopped hadoop fs -ls /
[edtuser@nladfmrvu11 ~]$
Created 08-22-2019 06:32 AM
Log in as the user and kinit to re-establish a valid ticket.
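For example, assuming 'edtuser' has its own principal (the EXAMPLE.COM realm below is just a placeholder):
[edtuser@nladfmrvu11 ~]$ kinit edtuser@EXAMPLE.COM
Password for edtuser@EXAMPLE.COM:
[edtuser@nladfmrvu11 ~]$ klist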
Created 08-22-2019 04:36 PM
Please make sure that you have a valid Kerberos ticket before running an HDFS command.
You can get a valid Kerberos ticket as follows:
1). Get the principal name from the keytab:
Example:
# klist -kte /etc/security/keytabs/hdfs.headless.keytab
Keytab name: FILE:/etc/security/keytabs/hdfs.headless.keytab
KVNO Timestamp Principal
---- ------------------- ------------------------------------------------------
2 08/11/2019 01:58:27 hdfs-ker1latest@EXAMPLE.COM (des-cbc-md5)
2 08/11/2019 01:58:27 hdfs-ker1latest@EXAMPLE.COM (aes256-cts-hmac-sha1-96)
2 08/11/2019 01:58:27 hdfs-ker1latest@EXAMPLE.COM (des3-cbc-sha1)
2 08/11/2019 01:58:27 hdfs-ker1latest@EXAMPLE.COM (arcfour-hmac)
2 08/11/2019 01:58:27 hdfs-ker1latest@EXAMPLE.COM (aes128-cts-hmac-sha1-96)
2). Get a valid Kerberos ticket as follows. Please note that the principal name in the following command might be different on your cluster, so change the principal name according to the output you received from the command above.
# kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs-ker1latest@EXAMPLE.COM
# klist
Ticket cache: FILE:/tmp/krb5cc_0
Default principal: hdfs-ker1latest@EXAMPLE.COM
Valid starting Expires Service principal
08/22/2019 22:47:43 08/23/2019 22:47:43 krbtgt/EXAMPLE.COM@EXAMPLE.COM
3). Now try to run the same HDFS command. This time it should run successfully.
# hadoop fs -ls /
*NOTE:* In the example above we are using "/etc/security/keytabs/hdfs.headless.keytab". If you have your own valid keytab that allows you to interact with HDFS, use that one instead; for testing you can use hdfs.headless.keytab.
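For instance, with a user-specific keytab (the path and principal below are placeholders, not taken from this cluster):
# klist -kte /etc/security/keytabs/edtuser.keytab
# kinit -kt /etc/security/keytabs/edtuser.keytab edtuser@EXAMPLE.COM
# hadoop fs -ls /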
Created on 08-25-2019 10:23 AM - edited 08-25-2019 10:28 AM
In reality a user shouldn't be able to kinit with the hdfs keytab. Instead, a keytab should be created for the specific user and deleted when that user is disabled on the cluster. Typically this user setup happens on the edge node where the Hadoop client software is installed, and that is the recommended setup for giving users access to the cluster.
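When the user is later disabled, the cleanup is usually just removing the principal and the keytab (a sketch, using the same names as the demo below):
[root@simba ~]# kadmin.local -q "delprinc -force konar@KENYA.KE"
[root@simba ~]# rm -f /home/konar/konar.keytab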
Below is a demo of what happens when the user konar attempts to access services in a Kerberized cluster.
# su - konar
[konar@simba ~]$ id
uid=1024(konar) gid=1024(konar) groups=1024(konar)
Now try to list the directories in HDFS
[konar@simba ~]$ hdfs dfs -ls /
Error
19/08/24 23:59:25 WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
ls: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "simba.kenya.ke/192.168.0.87"; destination host is: "simba.kenya.ke":8020;
Below is the expected output when user konar attempts to use the hdfs headless keytab:
[konar@simba ~]$ kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs-jair@KENYA.KE
kinit: Permission denied while getting initial credentials
To enable a user to access the cluster, perform the following steps on the Kerberos server as root (the Kerberos admin).
Assumption: the realm is KENYA.KE, the KDC host is simba, and you have root access on the KDC.
Create the principal for user konar
[root@simba ~]# kadmin.local
Authenticating as principal root/admin@KENYA.KE with password.
kadmin.local: addprinc konar@KENYA.KE
WARNING: no policy specified for konar@KENYA.KE; defaulting to no policy
Enter password for principal "konar@KENYA.KE":
Re-enter password for principal "konar@KENYA.KE":
Principal "konar@KENYA.KE" created.
kadmin.local: q
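As a shortcut, kadmin.local can also run a single subcommand non-interactively with -q; it will still prompt for the new principal's password:
[root@simba ~]# kadmin.local -q "addprinc konar@KENYA.KE"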
Validate that the principal was created, using the listprincs subcommand (list principals) and restricting the output to konar:
[root@simba ~]# kadmin.local
Authenticating as principal root/admin@KENYA.KE with password.
kadmin.local: listprincs *konar
konar@KENYA.KE
Type q [quit] to exit the kadmin utility
Generate the keytab
Generate the keytab for user konar using ktutil. It's good to change to /tmp (or a directory of your choice) first so you know the location of the generated keytab. Your encryption algorithm could be different, but the following should work.
[root@simba tmp]# ktutil
ktutil: addent -password -p konar@KENYA.KE -k 1 -e RC4-HMAC
Password for konar@KENYA.KE:
ktutil: wkt konar.keytab
ktutil: q
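Alternatively, since the principal already exists with a password, kadmin.local can export the keytab directly; the -norandkey flag keeps the existing key so the password remains usable (a sketch using the same principal and a /tmp path):
[root@simba tmp]# kadmin.local -q "ktadd -norandkey -k /tmp/konar.keytab konar@KENYA.KE"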
Validate the keytab creation
Check that the keytab was generated in the current directory; note the file permissions!
[root@simba tmp]# ls -lrt
-rw------- 1 root root 58 Aug 25 18:22 konar.keytab
As root, copy the generated keytab to the home directory of user konar, typically on the edge node.
[root@simba tmp]# cp konar.keytab /home/konar/
Change to konar's home directory and validate that the copy was successful
[root@simba tmp]# cd /home/konar/
[root@simba konar]# ll
total 4
-rw------- 1 root root 58 Aug 25 18:28 konar.keytab
Change file ownership
Change the ownership of konar.keytab so that user konar can read it.
[root@simba konar]# chown konar:konar konar.keytab
[root@simba konar]# ll
total 4
-rw------- 1 konar konar 58 Aug 25 18:28 konar.keytab
Switch to user konar and validate that the user still can't access HDFS.
[konar@simba ~]$ hdfs dfs -ls /
Output
19/08/25 18:36:44 WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
ls: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "simba.kenya.ke/192.168.0.87"; destination host is: "simba.kenya.ke":8020;
The Kerberos klist output also confirms that:
[konar@simba ~]$ klist
klist: No credentials cache found (filename: /tmp/krb5cc_1024)
As user konar, now try to kinit with the correct principal. The first step is to identify the correct principal:
[konar@simba ~]$ klist -kt konar.keytab
Keytab name: FILE:konar.keytab
KVNO Timestamp Principal
---- ------------------- ------------------------------------------------------
1 08/25/2019 18:22:34 konar@KENYA.KE
The above shows that the konar keytab is valid and contains the expected principal.
Now user konar can obtain a valid ticket using the snippet below, combining the keytab and the principal:
[konar@simba ~]$ kinit -kt konar.keytab konar@KENYA.KE
The above should not throw any error.
Now validate that the user has a valid ticket
[konar@simba ~]$ klist
Ticket cache: FILE:/tmp/krb5cc_1024
Default principal: konar@KENYA.KE
Valid starting Expires Service principal
08/25/2019 18:53:40 08/26/2019 18:53:40 krbtgt/KENYA.KE@KENYA.KE
Bravo, you have a valid ticket and hence access to the cluster. Let's validate that: the HDFS directory listing below should succeed.
[konar@simba ~]$ hdfs dfs -ls /
Found 10 items
drwxrwxrwx - yarn hadoop 0 2018-12-17 21:53 /app-logs
drwxr-xr-x - hdfs hdfs 0 2018-09-24 00:22 /apps
drwxr-xr-x - yarn hadoop 0 2018-09-24 00:12 /ats
drwxr-xr-x - hdfs hdfs 0 2018-09-24 00:12 /hdp
drwxr-xr-x - mapred hdfs 0 2018-09-24 00:12 /mapred
drwxrwxrwx - mapred hadoop 0 2018-09-24 00:12 /mr-history
drwxr-xr-x - hdfs hdfs 0 2018-12-17 19:16 /ranger
drwxrwxrwx - spark hadoop 0 2019-08-25 18:59 /spark2-history
drwxrwxrwx - hdfs hdfs 0 2018-10-11 11:16 /tmp
drwxr-xr-x - hdfs hdfs 0 2018-09-24 00:23 /user
User konar can now list directories and execute jobs on the cluster! As reiterated, in the recommended architecture the konar user should be set up on the edge node.
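To avoid re-running kinit by hand every time the ticket expires, user konar can refresh the ticket from the keytab automatically, for example from cron (a sketch, assuming the keytab stays in the user's home directory):
# crontab entry for user konar: refresh the ticket every 12 hours
0 */12 * * * kinit -kt /home/konar/konar.keytab konar@KENYA.KE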
Created 06-02-2021 03:26 AM
@Shelton wrote:
[konar@simba ~]$ kinit -kt konar.keytab konar@KENYA.KE
The above should not throw any error.
Now validate that the user has a valid ticket
[konar@simba ~]$ klist
Ticket cache: FILE:/tmp/krb5cc_1024
Default principal: konar@KENYA.KE
Valid starting Expires Service principal
08/25/2019 18:53:40 08/26/2019 18:53:40 krbtgt/KENYA.KE@KENYA.KE
Bravo, you have a valid ticket and hence access to the cluster. Let's validate that: the HDFS directory listing below should succeed.
I am getting the following error after executing the kinit -kt command for the hive user:
[hive@server-hdp ~]$ kinit -kt hive.keytab hive@MYDOMAIN.COM
kinit: Password incorrect while getting initial credentials
Please suggest how to solve this issue. Thanks.
My krb5.conf:
[hive@server-hdp ~]$ cat /etc/krb5.conf
[libdefaults]
#renew_lifetime = 7d
forwardable = true
default_realm = MYDOMAIN.COM
ticket_lifetime = 24h
dns_lookup_realm = false
dns_lookup_kdc = false
default_ccache_name = /tmp/krb5cc_%{uid}
#default_tgs_enctypes = aes des3-cbc-sha1 rc4 des-cbc-md5
#default_tkt_enctypes = aes des3-cbc-sha1 rc4 des-cbc-md5
[domain_realm]
mydomain.com = MYDOMAIN.COM
[logging]
default = FILE:/var/log/krb5kdc.log
admin_server = FILE:/var/log/kadmind.log
kdc = FILE:/var/log/krb5kdc.log
[realms]
MYDOMAIN.COM = {
admin_server = server-hdp.mydomain.com
kdc = server-hdp.mydomain.com
}
The keytab works for user1, and user1 can access HDFS without any issue.
Regards,
Amey.
Created 06-02-2021 03:54 AM
@Shelton If I try to use hdfs.headless.keytab:
[hive@server-hdp ~]$ kinit -kt /etc/security/keytabs/hdfs.headless.keytab
kinit: Client 'host/server-hdp.mydomain.com@MYDOMAIN.COM' not found in Kerberos database while getting initial credentials
[hive@server-hdp ~]$ klist
klist: No credentials cache found (filename: /tmp/krb5cc_1001)
[hive@server-hdp ~]$
Created 06-02-2021 01:50 PM
Please have a look at my other posting on keytabs.
Having said that, you have switched to the hive user and are attempting to use the hdfs headless keytab. That's not possible.
As the root user, run the following steps (if kinit cannot determine the principal on its own, pass the hdfs principal name shown by klist -kte on the keytab):
# su - hdfs
[hdfs@server-hdp ~]$ kinit -kt /etc/security/keytabs/hdfs.headless.keytab
Now you should have a valid ticket
[hdfs@server-hdp ~]$ klist
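Then re-run the HDFS command to confirm access, for example:
[hdfs@server-hdp ~]$ hdfs dfs -ls /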
Happy hadooping !!!