
Enabling Oozie and Storm Web UI using Cross Realm


Has anyone succeeded in configuring/accessing the Kerberos-enabled Oozie web UI using an AD realm?

Right now our cluster is configured with a local realm that the services use, plus a cross-realm trust to AD. All users use their enterprise accounts to get tickets for the Hadoop services.
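For reference, a cross-realm setup like this is usually described in the client's krb5.conf. A minimal sketch using the realm names from this question (the KDC hostnames are placeholders, not from the original post):

```ini
[realms]
  HDP_EIT_DEV.com = {
    kdc = kdc.hdp-dev.example.com
    admin_server = kdc.hdp-dev.example.com
  }
  GSM1900.ORG = {
    kdc = ad-dc.example.org
  }

[domain_realm]
  .unix.gsm1900.org = HDP_EIT_DEV.com

[capaths]
  # direct trust path from the AD realm to the cluster realm
  GSM1900.ORG = {
    HDP_EIT_DEV.com = .
  }
```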

While trying to do the same from our local machine to access the Oozie and Storm web UIs, we hit the exception below:

GSSException: Failure unspecified at GSS-API level (Mechanism level: Specified version of key is not available (44))

When verifying the ticket using klist the output is as follows

hw11980:dev msundaram$ klist

Credentials cache: API:3911E669-2B88-401B-8291-0352420190A7

Principal: msundar1@GSM1900.ORG

Issued Expires Principal

Nov 2 18:00:58 2015 Nov 3 04:00:58 2015 krbtgt/GSM1900.ORG@GSM1900.ORG

Nov 2 18:01:13 2015 Nov 3 04:00:58 2015 HTTP/devehdp004.unix.gsm1900.org@GSM1900.ORG

When trying with the local realm, we are able to access the UI without any issues, and klist looks like this:

hw11980:dev msundaram$ kinit -k -t hdpsrvc.keytab hdpsrvc

hw11980:dev msundaram$ klist

Credentials cache: API:C30071FF-156B-4608-940F-3C10D800F519

Principal: hdpsrvc@HDP_EIT_DEV.com

Issued Expires Principal

Nov 2 18:06:49 2015 Nov 3 18:06:49 2015 krbtgt/HDP_EIT_DEV.com@HDP_EIT_DEV.com

Nov 2 18:06:57 2015 Nov 3 18:06:49 2015 HTTP/devehdp004.unix.gsm1900.org@HDP_EIT_DEV.com

hw11980:dev msundaram$

We are not sure how to handle the SPNEGO ticket [HTTP/devehdp004.unix.gsm1900.org@GSM1900.ORG] that is created for AD.
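One way to check whether the SPNEGO negotiation itself works from the client, independent of a browser, is curl's built-in Negotiate support. The URL and port below are assumptions based on the hostname in the question and Oozie's default web port (11000):

```shell
# obtain a ticket for the AD user first
kinit msundar1@GSM1900.ORG

# -u : tells curl to take the identity from the Kerberos ticket cache;
# --negotiate enables SPNEGO; -v shows the Negotiate handshake headers
curl -v --negotiate -u : http://devehdp004.unix.gsm1900.org:11000/oozie/
```

A 401 followed by a successful Negotiate round trip means SPNEGO itself is fine and the problem is elsewhere; a GSS error here reproduces the browser failure on the command line.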

Has anyone succeeded in configuring this before? We need help with this configuration.

Right now we have configured Hue with Oozie, but it goes down frequently because too many users are already using it for Hive/HDFS.

The same setup works in our QAT environment, where klist looks like this:

hw11980:dev msundaram$ klist

Credentials cache: API:22D5D25B-E5F2-4372-AFB2-34B0944DA683

Principal: msundar1@GSMTEST.ORG

Issued Expires Principal

Nov 2 18:22:52 2015 Nov 3 04:22:52 2015 krbtgt/GSMTEST.ORG@GSMTEST.ORG

hw11980:dev msundaram$ klist

Credentials cache: API:22D5D25B-E5F2-4372-AFB2-34B0944DA683

Principal: msundar1@GSMTEST.ORG

Issued Expires Principal

Nov 2 18:22:52 2015 Nov 3 04:22:52 2015 krbtgt/GSMTEST.ORG@GSMTEST.ORG

Nov 2 18:58:33 2015 Nov 3 04:22:52 2015 krbtgt/HDP_EIT_QAT.COM@GSMTEST.ORG

Nov 2 18:58:33 2015 Nov 3 04:22:52 2015 HTTP/qatehdp003.unix.gsm1900.org@HDP_EIT_QAT.COM

So it obtains the krbtgt/HDP_EIT_QAT.COM@GSMTEST.ORG cross-realm service ticket properly.

1 ACCEPTED SOLUTION


I have an environment configured that is similar to yours (the Hadoop cluster uses realm XYC.COM, but users can come from XYC.COM, ABC.COM, or ZET.COM). Users with a valid Kerberos ticket can use the Storm or Oozie UI, both of which are secured with SPNEGO. What Kerberos implementation is this? MIT KDC?

Can you post your OS, Java, and HDP versions? Thanks.

The error you are getting is related to the secret key (KVNO = key version number) that is used to authenticate your user with the KDC and to obtain and encrypt the Kerberos tickets.

A tag associated with encrypted data identifies which key was used for encryption when a long-lived key associated with a principal changes over time. It is used during the transition to a new key so that the party decrypting a message can tell whether the data was encrypted with the old or the new key. (RFC-4120)

The error occurs because the key version of your ticket is different from the one on the KDC server. This happens, for example, when a user changes their password or a new secret key is generated for the service principals while the keytab files still contain the old KVNO.

For example:

  • User gets ticket from KDC with kvno=1
  • User changes password => KVNO is changed to kvno=2
  • KVNO change is picked up by the server
  • Old User ticket is still valid because user machine was never restarted and the ticket cache never cleared
  • Next access request to the server will fail since the key version numbers are different

Possible solutions:

  • Regenerate Keytabs
  • Destroy user ticket and purge cache (reboot should clear cache)
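The mismatch described above can be checked directly with the MIT Kerberos tools before regenerating anything. A sketch, assuming the principal names from the question (the keytab path is a typical HDP location, shown here as an assumption):

```shell
# kvno fetches a service ticket and reports the key version the KDC used
kvno HTTP/devehdp004.unix.gsm1900.org@GSM1900.ORG

# compare with the key versions stored in the SPNEGO keytab on the server
klist -kte /etc/security/keytabs/spnego.service.keytab

# if they differ, clear the client cache and re-authenticate
kdestroy
kinit msundar1@GSM1900.ORG
```

If the kvno reported by the KDC is higher than the one in the keytab, regenerating the keytab is the fix; if they match, stale client tickets are the likelier culprit.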


2 REPLIES



Since you are accessing it from the cross realm, your Windows-based clients need to know the KDC location in order to create a proper SPNEGO token. You either need to create a separate policy in your AD for that type of user, or you can manually add mappings to the client machines' registries from the command line so that they know about your Hadoop realm's KDC:

ksetup /addkdc HADOOP.DOMAIN.COM hadoop.nodewithkdc.com

ksetup /addhosttorealmmap hadoop.nodewithkdc.com HADOOP.DOMAIN.COM
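After running the two commands above from an elevated command prompt, the result can be sanity-checked on the same client (realm and host names are the example values used above):

```shell
rem with no arguments, ksetup lists the currently configured default
rem realm, KDC entries, and host-to-realm mappings, so the /addkdc and
rem /addhosttorealmmap entries above should appear in its output
ksetup
```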