I am trying to connect to a Hive database with Oracle SQL Developer, using the Cloudera Hive JDBC drivers.
I keep getting the following error message.
My Hadoop environment has
I am trying to connect to this Hive database from Windows 7 64-bit machine which has
Important: the Windows machine I am connecting from is on a different domain than the Hadoop cluster.
I followed the instructions from Using SQL Developer to access Apache Hive with Kerberos authentication, and the steps I performed are:
Hive connection details were:
Host name: machine.test.group
Port: 10010
Database: default
KrbServiceName: hive
AuthMech: 1
KrbFQDN: machine.test.group
KrbRealm: dev.mycompany.com
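For reference, those settings correspond to a Cloudera Hive JDBC URL roughly like the one sketched below. This is an assumption on my part: the property names (AuthMech, KrbRealm, KrbHostFQDN, KrbServiceName) follow the Cloudera driver's documented format, and I am reading the "KrbFQDN" setting above as the driver's KrbHostFQDN property.

```shell
# Sketch: assemble the Cloudera Hive JDBC URL from the settings above.
# AuthMech=1 selects Kerberos authentication in the Cloudera driver.
HOST=machine.test.group
PORT=10010
DB=default
URL="jdbc:hive2://${HOST}:${PORT}/${DB};AuthMech=1;KrbRealm=dev.mycompany.com;KrbHostFQDN=${HOST};KrbServiceName=hive"
echo "$URL"
```

Pasting a URL in this form into the connection dialog (or a JDBC client) is equivalent to filling in the individual fields.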
Can someone please advise me on how to fix this issue and connect to Hive using the JDBC drivers?
I am using Linux and have never tried this from Windows, but I still hope it gives some insight.
1. Create a keytab (for the source login) in the environment where you have Kerberos installed, and keep the keytab file somewhere like /home/user/.auth/example.keytab (change the path to the Windows equivalent).
2. Create a shell script to call kinit (change the script and the kinit command to the Windows equivalent):
kinit user@REALM -k -t /home/user/.auth/example.keytab
3. Create a cron job (or any other scheduler that suits Windows) to call the above script at a regular interval. By default, a Kerberos ticket will expire within a week, so you need a job that re-runs kinit periodically.
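As a concrete sketch of steps 2 and 3 (the paths, the principal, and the 6-hour schedule are all hypothetical; adapt them to your environment):

```shell
#!/bin/sh
# renew_ticket.sh -- re-obtain the Kerberos TGT from the keytab.
kinit user@REALM -k -t /home/user/.auth/example.keytab

# Hypothetical crontab entry (install with `crontab -e`):
# re-run this script every 6 hours so the ticket never lapses.
#   0 */6 * * * /home/user/.auth/renew_ticket.sh
```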
I have tried your suggestion but I am still getting an error while trying to connect to Hive.
kinit -k -t ktfile.keytab USER@dev.mycompany.com
I have noticed that the above kinit command appends data to my original keytab file, and any subsequent attempt to run kinit to get a ticket results in the following error:
kinit: Unsupported key table format version number while getting initial credentials
Is the error message that you showed above the partial or the full error? If it is partial, then please check whether your error message has any keyword like 'krb'...
Ex:
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
    at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
    at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
If you don't find any krb-related error, then please do not focus on Kerberos... instead, focus on other points, such as whether the port has been opened between the environments, and so on.
The entire error message I see in Oracle SQL Developer is:
An error was encountered performing the requested operation: [Cloudera][HiveJDBCDriver](500164) Error initialized or created transport for authentication: [Cloudera][HiveJDBCDriver](500169) Unable to connect to server: GSS initiate failed. Vendor code 500164
Does it mean it is not kerberos error message?
PS: Does having Hadoop cluster (Linux) on different domain than my client machine (Windows) have any bearing on connection?
Let me put it this way...
1. Do you have access to your Linux box by any chance? If so, please log in and authenticate with your keytab to make sure everything is OK with your keytab.
2. If you don't have access, ask your admin (someone who has access) to use your keytab to authenticate and confirm there is no issue with the keytab.
3. In the meantime, the error doesn't show anything about Kerberos [usually it will show a krb issue if you access from the Linux box, but I never tried from a Windows machine, so I'm not sure]. So you need to make sure the port is open, check the firewall, etc., to confirm that everything other than Kerberos is OK.
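To rule out the network side as suggested in point 3, a quick reachability check from the client might look like this (a sketch: the host and port are taken from the question, and `nc` is assumed to be available on the client):

```shell
# Check that the HiveServer2 port is reachable from this machine.
# -v: verbose output, -z: probe the port without sending data.
nc -vz machine.test.group 10010
```

If this fails, the problem is connectivity (firewall, routing, DNS) rather than Kerberos.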
If you are using the KRB5CCNAME variable, it should not point at the keytab file. That is the reason you cannot kinit twice with the same keytab.
That variable should point at the Kerberos ticket cache generated using the keytab or a login/password.
Also, you need to get a valid Kerberos ticket using the same Kerberos REALM as your Hadoop cluster is using.
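A sketch of what that looks like on Linux (the cache path and principal are hypothetical; on Windows the cache location and syntax differ):

```shell
# Point KRB5CCNAME at a ticket *cache* file -- never at the keytab,
# or kinit will write ticket data into the keytab and corrupt it
# (producing the "Unsupported key table format version number" error).
export KRB5CCNAME=FILE:/tmp/krb5cc_user
echo "$KRB5CCNAME"

# kinit then reads the keytab and writes the TGT into that cache
# (shown as comments, since they need a reachable KDC):
#   kinit -k -t /home/user/.auth/example.keytab user@DEV.MYCOMPANY.COM
#   klist
```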
This issue was resolved by following the instructions in this site: http://vijayjt.blogspot.com/2016/02/how-to-connect-to-kerberised-chd-hadoop.html
We needed to copy the Java JCE unlimited-strength policy files and the krb5.conf file into the jdk/jre/lib/security folder of the JDK that SQL Developer uses. After this, the Hive connection via Kerberos was successful.
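As a sketch of that copy step (the SQL Developer install path is hypothetical; local_policy.jar and US_export_policy.jar are the standard file names in the JCE unlimited-strength policy download):

```shell
# Copy the JCE unlimited-strength policy jars and the Kerberos config
# into the security folder of the JDK that SQL Developer uses.
# Hypothetical path -- adjust SQLDEV_JDK for your installation
# (on Windows this would be something like C:\sqldeveloper\jdk).
SQLDEV_JDK=/opt/sqldeveloper/jdk
cp local_policy.jar      "$SQLDEV_JDK/jre/lib/security/"
cp US_export_policy.jar  "$SQLDEV_JDK/jre/lib/security/"
cp krb5.conf             "$SQLDEV_JDK/jre/lib/security/"
```

Restart SQL Developer after copying so the new policy files are picked up.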