Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.

Kerberos ticket expired (kinit with keytab succeeds, Java security policy applied)

Explorer

 

Dear all,

 

I got the following error on my HDFS DataNodes. However, kinit with the keytab files generated by Cloudera Manager succeeds, and the Java security policy has been applied.

 

Is there any way to diagnose such a problem?

 

For more information:

No LDAP is configured on the cluster nodes. Cloudera Manager and CDH version: 5.13.2.
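A few commands that can help verify the keytab and the ticket validity window (a hedged sketch; the keytab path and principal names are placeholders based on the log below, so adjust them for your hosts):

```shell
# List the keys stored in the DataNode keytab (path is a placeholder):
klist -kt /path/to/hdfs.keytab

# Obtain a TGT from the keytab and inspect its validity window:
kinit -kt /path/to/hdfs.keytab hdfs/datanode.fqdn@REALM
klist    # check the "Valid starting" and "Expires" columns

# Reproduce the failing TGS step by requesting a NameNode service ticket:
kvno hdfs/namenode.fqdn@REALM
```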

 

javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Ticket expired (32) - PROCESS_TGS)]
	at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
	at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:413)
	at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:594)
	at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:396)
	at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:761)
	at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:757)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:756)
	at org.apache.hadoop.ipc.Client$Connection.access$3000(Client.java:396)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1557)
	at org.apache.hadoop.ipc.Client.call(Client.java:1480)
	at org.apache.hadoop.ipc.Client.call(Client.java:1441)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy23.versionRequest(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.versionRequest(DatanodeProtocolClientSideTranslatorPB.java:275)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.retrieveNamespaceInfo(BPServiceActor.java:168)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:214)
	at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:673)
	at java.lang.Thread.run(Thread.java:745)
Caused by: GSSException: No valid credentials provided (Mechanism level: Ticket expired (32) - PROCESS_TGS)
	at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:710)
	at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248)
	at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
	at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
	... 20 more
Caused by: KrbException: Ticket expired (32) - PROCESS_TGS
	at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:73)
	at sun.security.krb5.KrbTgsReq.getReply(KrbTgsReq.java:192)
	at sun.security.krb5.KrbTgsReq.sendAndGetCreds(KrbTgsReq.java:203)
	at sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:309)
	at sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(CredentialsUtil.java:115)
	at sun.security.krb5.Credentials.acquireServiceCreds(Credentials.java:454)
	at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:641)
	... 23 more
Caused by: KrbException: Identifier doesn't match expected value (906)
	at sun.security.krb5.internal.KDCRep.init(KDCRep.java:143)
	at sun.security.krb5.internal.TGSRep.init(TGSRep.java:66)
	at sun.security.krb5.internal.TGSRep.<init>(TGSRep.java:61)
	at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:55)
	... 29 more

 

 

In /var/log/krb5kdc.log, I see messages like this:

 

Oct 02 09:52:12 kdc.fqdn krb5kdc[2426](info): TGS_REQ (1 etypes {23}) 172.32.237.83: PROCESS_TGS: authtime 0, hdfs/datanode.fqdn@REALM for hdfs/namenode.fqdn@REALM, Ticket expired
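The "authtime 0 ... Ticket expired" entry means the KDC considered the TGT presented during the TGS exchange already expired. One way to inspect the configured lifetimes (a sketch assuming an MIT krb5 KDC with kadmin access; REALM and principals are placeholders matching the log above):

```shell
# Maximum ticket life granted for TGTs in this realm:
kadmin.local -q "getprinc krbtgt/REALM@REALM"

# Lifetime limits on the affected service principal:
kadmin.local -q "getprinc hdfs/datanode.fqdn@REALM"

# Realm-wide max_life / max_renewable_life settings (default EL7 path):
grep -E 'max_life|max_renewable_life' /var/kerberos/krb5kdc/kdc.conf
```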

 

 

Thanks,

Roy

1 ACCEPTED SOLUTION

Explorer

Dear all,

 

After I updated the Kerberos packages, the cluster resumed operation.

Hope this helps. Thanks.

 

For more information: OS: CentOS Linux release 7.5.1804 (Core)

 

Package versions after the update:

[root@namenode x86_64]# rpm -qa | grep krb
krb5-server-1.15.1-19.el7.x86_64
sssd-krb5-common-1.16.0-19.el7.x86_64
sssd-krb5-1.16.0-19.el7.x86_64
krb5-libs-1.15.1-19.el7.x86_64
krb5-devel-1.15.1-19.el7.x86_64
krb5-workstation-1.15.1-19.el7.x86_64

 

 

[root@datanode01 ~]# rpm -qa | grep krb
krb5-devel-1.15.1-19.el7.x86_64
krb5-workstation-1.15.1-19.el7.x86_64
sssd-krb5-common-1.16.0-19.el7.x86_64
sssd-krb5-1.16.0-19.el7.x86_64
krb5-libs-1.15.1-19.el7.x86_64
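For reference, a minimal sketch of how those versions could be reached with yum (run as root on each node; package names taken from the listings above):

```shell
# Update the Kerberos client packages on every cluster node:
yum update krb5-libs krb5-workstation krb5-devel

# On the KDC host only, also update the server package:
yum update krb5-server

# Verify the installed versions afterwards:
rpm -qa | grep krb
```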

 

Roy


11 Replies

Contributor
Hi, thanks for the resolution. Are there any specific procedures for upgrading the krb packages to 1.15.1-19?
Detailed steps for the upgrade, and for starting the Hadoop cluster afterwards, would be very helpful.

avatar
Explorer

Hi Prash,

I upgraded the packages at an early stage of the setup, so I just created a dump (backup) of the Kerberos database and used the yum update command to upgrade.

After updating the packages, the cluster could be started.
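The backup-then-update flow described above might look like this (a sketch assuming an MIT krb5 KDC on CentOS 7; the dump path is illustrative):

```shell
# 1. Dump (back up) the Kerberos database before touching the packages:
kdb5_util dump /root/kdc-db-backup.dump

# 2. Update all Kerberos packages:
yum update 'krb5-*'

# 3. Restart the KDC services so the updated libraries are picked up:
systemctl restart krb5kdc kadmin
```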