Kerberos authentication with hive JDBC driver

New Contributor

I am trying to connect to a Hive database from Oracle SQL Developer using the Cloudera Hive JDBC drivers.

I keep getting the following error message:

 

Status : Failure -Test failed: [Cloudera][HiveJDBCDriver](500164) Error initialized or created transport for authentication: [Cloudera][HiveJDBCDriver](500169) Unable to connect to server: GSS initiate failed.

 

My Hadoop environment has:

  • Hive version 1.2.1.2.3
  • Kerberos version 1.10.3-10

I am trying to connect to this Hive database from a Windows 7 64-bit machine, which has:

  • SQL Developer version 4.2.0.16.356.1154
  • Cloudera Hive JDBC4 driver 2.5.18.1050
  • MIT Kerberos app version 4.1

 

Important: the Windows machine I am connecting from is on a different domain than the Hadoop cluster.

 

I have followed the instructions from Using SQL Developer to access Apache Hive with Kerberos authentication, and the steps I have performed are:

  1. Imported all the jar files from the JDBC driver into SQL Developer.
  2. Updated the Java crypto jars (local_policy.jar and US_export_policy.jar in the sqldeveloper\jdk\jre\lib\security folder) with the ones provided in UnlimitedJCEPolicy.zip.
  3. Created an environment variable KRB5CCNAME whose value is set to C:\sqldeveloper\ktfile.keytab.
  4. Installed the MIT Kerberos 4.1 64-bit app.
  5. Acquired a valid ticket (via kinit or through the app).
  6. Configured the connection details shown below.

Hive connection details were:
Host name: machine.test.group
Port: 10010
Database: default

KrbServiceName: hive
AuthMech: 1
KrbFQDN: machine.test.group
KrbRealm: dev.mycompany.com
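
For reference, these settings map onto the Cloudera JDBC connection URL format. Below is a minimal smoke-test sketch, not from the original post: the driver class name and URL property keys follow Cloudera's Hive JDBC4 driver documentation (where the FQDN property is spelled KrbHostFQDN), the host, port, database, and realm are the values above, and it assumes the driver jars are on the classpath and a valid Kerberos ticket is already in the cache.

import java.sql.Connection;
import java.sql.DriverManager;

public class HiveKerberosSmokeTest {
    public static void main(String[] args) throws Exception {
        // Cloudera Hive JDBC4 (HiveServer2) driver class.
        Class.forName("com.cloudera.hive.jdbc4.HS2Driver");
        // AuthMech=1 selects Kerberos authentication in the Cloudera driver.
        String url = "jdbc:hive2://machine.test.group:10010/default;"
                + "AuthMech=1;"
                + "KrbRealm=dev.mycompany.com;"
                + "KrbHostFQDN=machine.test.group;"
                + "KrbServiceName=hive";
        try (Connection conn = DriverManager.getConnection(url)) {
            System.out.println("Connected: " + !conn.isClosed());
        }
    }
}

If this standalone test connects but SQL Developer does not, the problem is likely in the SQL Developer setup rather than in Kerberos or the network.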

 

Can someone please advise what I can do to fix this issue and connect to Hive using the JDBC drivers?

8 REPLIES

Champion

@majeedk

 

I am using Linux and have never tried this from Windows, but I hope it may still give some insights.

 

1. Create a keytab (for the source login) in the environment where you have Kerberos installed, and keep the keytab file somewhere like /home/user/.auth/example.keytab (change the path to the Windows equivalent).


2. Create a shell script to call kinit (change the shell script and the kinit command to the Windows equivalent):
kinit user@REALM -k -t /home/user/.auth/example.keytab


3. Create a cron job (or any scheduled job that suits Windows) to call the above script at a frequent interval, because by default a Kerberos ticket will expire in a week, so you need a job that re-runs kinit periodically.
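
A rough Windows equivalent of steps 2 and 3, assuming the default MIT Kerberos for Windows install path (the batch file name, keytab path, and hourly schedule are arbitrary choices to adjust):

:: kinit_renew.bat - re-acquire the ticket from the keytab
"C:\Program Files\MIT\Kerberos\bin\kinit.exe" -k -t C:\sqldeveloper\ktfile.keytab user@REALM

Then register it with the built-in Task Scheduler so the ticket is refreshed periodically:

schtasks /create /tn "KerberosRenew" /tr "C:\sqldeveloper\kinit_renew.bat" /sc hourly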

 

Thanks

Kumar

New Contributor

Hi @saranvisa

I have tried your suggestion, but I am still getting an error while trying to connect to Hive.

 

  • I copied the keytab file from the Hadoop cluster to my Windows machine (ktfile.keytab).
  • I ran the following command from the directory where I copied the keytab file:

 

kinit -k -t ktfile.keytab USER@dev.mycompany.com

 

  • The above command gets a valid ticket, and I can view it using the MIT Kerberos app.
  • After this, if I try to connect to the Hive database from SQL Developer, I get the following error:
Status : Failure -Test failed: [Cloudera][HiveJDBCDriver](500164) Error initialized or created transport for authentication: [Cloudera][HiveJDBCDriver](500169) Unable to connect to server: GSS initiate failed.

I have noticed that the above kinit command appends data to my original keytab file, and any subsequent attempt to run kinit to get a ticket results in the following error:

kinit: Unsupported key table format version number while getting initial credentials
Champion

@majeedk

Status : Failure -Test failed: [Cloudera][HiveJDBCDriver](500164) Error initialized or created transport for authentication: [Cloudera][HiveJDBCDriver](500169) Unable to connect to server: GSS initiate failed.

 

Is the error message that you showed above partial or the full error? If it is partial, then please check whether your error message has any keyword like 'krb'...

 

Ex:
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)

 

If you don't find any krb-related error, then please do not focus on Kerberos... instead, focus on other points, like whether the port has been opened between the environments, etc. (see the quick check below).
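
For example, a quick reachability check from the Windows client, using the host and port from this thread (the Telnet client is an optional Windows feature that may need to be enabled first):

telnet machine.test.group 10010

If the connection is refused or times out, the issue is network/firewall rather than Kerberos.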

 

 

New Contributor

 @saranvisa

 

The entire error message I see in Oracle SQL Developer is

 

An error was encountered performing the requested operation:

[Cloudera][HiveJDBCDriver](500164) Error initialized or created transport for authentication: [Cloudera][HiveJDBCDriver](500169) Unable to connect to server: GSS initiate failed.

Vendor code 500164

 

Does this mean it is not a Kerberos error message?

 

PS: Does having the Hadoop cluster (Linux) on a different domain than my client machine (Windows) have any bearing on the connection?

Champion

@majeedk

 

Let me put it this way...

1. Do you have access to your Linux box by any chance? If so, please log in and authenticate with your keytab to make sure everything is OK with your keytab (see the sketch below).

2. If you don't have access, ask your admin (someone who has access) to use your keytab to authenticate and confirm there is no issue with the keytab.

3. In the meantime, the error doesn't show anything about Kerberos [usually it will show a krb issue if you access from a Linux box, but I have never tried from a Windows machine, so I'm not sure]. So you need to check that the port is open, firewall rules, etc., to make sure everything other than Kerberos is OK.
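
For point 1, a keytab sanity check on the Linux box could look like this (the keytab path is a placeholder; klist -k -t lists the principals stored in the keytab, and the final klist confirms a ticket was obtained):

klist -k -t /path/to/ktfile.keytab
kinit -k -t /path/to/ktfile.keytab USER@dev.mycompany.com
klist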

Super Collaborator

Hi,

 

If you are using the KRB5CCNAME variable, it should not point to the keytab file. That is why you cannot kinit twice with the same keytab: kinit writes the ticket cache to whatever location KRB5CCNAME names, which corrupts the keytab.

 

That variable should target the Kerberos ticket cache generated using the keytab or login/password.
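
On Windows that could look like the following sketch, reusing the paths and principal from this thread (the cache file name itself is an arbitrary choice):

set KRB5CCNAME=FILE:C:\sqldeveloper\krb5cache
kinit -k -t C:\sqldeveloper\ktfile.keytab USER@dev.mycompany.com

kinit then writes the fresh ticket into C:\sqldeveloper\krb5cache and leaves the keytab file untouched.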

 

Also, you need to get a valid Kerberos ticket using the same Kerberos REALM as your Hadoop cluster is using.

 

Expert Contributor

Hello,
Did you resolve this SQL Developer connection error? If so, what was the solution? I have the same issue. Thanks!

Expert Contributor

This issue was resolved by following the instructions on this site: http://vijayjt.blogspot.com/2016/02/how-to-connect-to-kerberised-chd-hadoop.html

 

We needed to copy the Java JCE unlimited-strength policy files and the krb5.conf file under the jdk/jre/lib/security folder where SQL Developer is installed. After this, the Hive connection via Kerberos was successful.
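
For reference, a minimal krb5.conf sketch using the realm from this thread (the KDC host name is a hypothetical placeholder; replace it with your cluster's actual KDC):

[libdefaults]
    default_realm = dev.mycompany.com

[realms]
    dev.mycompany.com = {
        kdc = kdc.dev.mycompany.com
        admin_server = kdc.dev.mycompany.com
    }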