
Hive JDBC client error when connecting to Kerberos Cloudera cluster


Hi All,

 

In my project I need to connect to Hive through JDBC. I developed a JDBC client program, but when connecting I get the error below. What do I need to do?

 

Cloudera CDH version (Hadoop 2.6.0-cdh5.4.3)

 

 

2015-08-14 18:16:55 DEBUG UserGroupInformation:1693 - PrivilegedAction as:<my login name> (auth:SIMPLE) from:org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)

2015-08-14 18:16:55 DEBUG TSaslTransport:261 - opening transport org.apache.thrift.transport.TSaslClientTransport@41c2284a
2015-08-14 18:16:55 ERROR TSaslTransport:315 - SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]

 

Below is a snippet of the HiveClient program, followed by the steps I am executing:

 

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class MyHiveClient {

    public static Connection createConnection() throws Exception {
        // Register the Hive JDBC driver
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // Kerberos-secured HiveServer2 URL; the principal must match the server's service principal
        String hive2JDBCConnectionURL =
                "jdbc:hive2://hiveserver-ip-address:10000/default;principal=hive/_HOST@A.B.COM;sasl.qop=auth-conf";

        return DriverManager.getConnection(hive2JDBCConnectionURL, new Properties());
    }
}


My Java class is invoked through a shell script (say run_metrics.sh). This Java class internally creates a Hive JDBC connection by invoking the above MyHiveClient.createConnection.
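For illustration, a minimal caller along these lines might look like the sketch below (the MetricsJob class name and the SHOW TABLES query are hypothetical, not from the original post):

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;

public class MetricsJob {
    public static void main(String[] args) throws Exception {
        // Open the Kerberos-secured connection and run a trivial query
        try (Connection conn = MyHiveClient.createConnection();
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SHOW TABLES")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}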

 

This is what I do:

 

kinit

 

<enter password here>

 

./run_metrics.sh

<Now I get the above error>

 

1 ACCEPTED SOLUTION


I was able to resolve this issue; the Oracle link below helped me resolve it:

 

http://docs.oracle.com/javase/7/docs/technotes/guides/security/jgss/tutorials/Troubleshooting.html

GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos Ticket)

 

The solution is to specify -Djavax.security.auth.useSubjectCredsOnly=false when executing the Java program from the command line. With this property set to false, JGSS may acquire credentials from outside the JAAS Subject, such as the ticket cache populated by kinit.

That means

java -Djavax.security.auth.useSubjectCredsOnly=false ...........

 

My Java program internally uses the Hive JDBC API.

 

This is what I did:

 

1. kinit from the command line

2. Run the Java program with the above -D property, and specify the appropriate Hive JDBC URL (with the principal name etc.) in the JDBC URL
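
For reference, the same property can also be set programmatically before the first connection is opened — a minimal sketch, assuming you control the program's entry point (the MetricsJobLauncher class name is hypothetical):

import java.sql.Connection;

public class MetricsJobLauncher {
    public static void main(String[] args) throws Exception {
        // Equivalent to passing -Djavax.security.auth.useSubjectCredsOnly=false
        // on the java command line; must be set before any Kerberos/GSS activity.
        System.setProperty("javax.security.auth.useSubjectCredsOnly", "false");

        try (Connection conn = MyHiveClient.createConnection()) {
            System.out.println("Connection opened: " + !conn.isClosed());
        }
    }
}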

 

 

 

 


6 REPLIES


Here is the additional stack trace:

 

2015-08-14 18:44:40 INFO Utils:285 - Supplied authorities: hiveserver-ip-address:10000
2015-08-14 18:44:40 WARN Utils:401 - ***** JDBC param deprecation *****
2015-08-14 18:44:40 WARN Utils:402 - The use of sasl.qop is deprecated.
2015-08-14 18:44:40 WARN Utils:403 - Please use saslQop like so: jdbc:hive2://<host>:<port>/dbName;saslQop=<qop_value>
2015-08-14 18:44:40 INFO Utils:372 - Resolved authority: hiveserver-ip-address:10000
2015-08-14 18:44:40 DEBUG MutableMetricsFactory:42 - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])
2015-08-14 18:44:40 DEBUG MutableMetricsFactory:42 - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])
2015-08-14 18:44:40 DEBUG MutableMetricsFactory:42 - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[GetGroups])
2015-08-14 18:44:40 DEBUG MetricsSystemImpl:231 - UgiMetrics, User and group related metrics
2015-08-14 18:44:40 DEBUG Groups:301 - Creating new Groups object
2015-08-14 18:44:40 DEBUG NativeCodeLoader:46 - Trying to load the custom-built native-hadoop library...
2015-08-14 18:44:40 DEBUG NativeCodeLoader:55 - Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
2015-08-14 18:44:40 DEBUG NativeCodeLoader:56 - java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
2015-08-14 18:44:40 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2015-08-14 18:44:40 DEBUG PerformanceAdvisory:41 - Falling back to shell based
2015-08-14 18:44:40 DEBUG JniBasedUnixGroupsMappingWithFallback:45 - Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
2015-08-14 18:44:40 DEBUG Groups:112 - Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
2015-08-14 18:44:40 DEBUG UserGroupInformation:221 - hadoop login
2015-08-14 18:44:40 DEBUG UserGroupInformation:156 - hadoop login commit
2015-08-14 18:44:40 DEBUG UserGroupInformation:186 - using local user:UnixPrincipal: <my login name>
2015-08-14 18:44:40 DEBUG UserGroupInformation:192 - Using user: "UnixPrincipal: <my login name>" with name <my login name>
2015-08-14 18:44:40 DEBUG UserGroupInformation:202 - User entry: "<my login name>"
2015-08-14 18:44:40 DEBUG UserGroupInformation:840 - UGI loginUser:<my login name> (auth:SIMPLE)
2015-08-14 18:44:40 DEBUG HadoopThriftAuthBridge:155 - Current authMethod = SIMPLE
2015-08-14 18:44:40 DEBUG HadoopThriftAuthBridge:93 - Setting UGI conf as passed-in authMethod of kerberos != current.
2015-08-14 18:44:40 INFO HiveConnection:189 - Will try to open client transport with JDBC Uri: jdbc:hive2://hiveserver-ip-address:10000/default;principal=hive/_HOST@A.B.COM;sasl.qop=auth-conf
2015-08-14 18:44:40 DEBUG UserGroupInformation:1693 - PrivilegedAction as:<my login name> (auth:SIMPLE) from:org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
2015-08-14 18:44:40 DEBUG TSaslTransport:261 - opening transport org.apache.thrift.transport.TSaslClientTransport@41c2284a
2015-08-14 18:44:40 ERROR TSaslTransport:315 - SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:190)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:163)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:208)
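
As an aside, the WARN lines in this trace flag the sasl.qop URL parameter as deprecated; following the warning's own suggestion, the equivalent non-deprecated URL would be:

jdbc:hive2://hiveserver-ip-address:10000/default;principal=hive/_HOST@A.B.COM;saslQop=auth-conf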


Community Manager

Congratulations on solving the issue, and thank you for posting the solution in case others run into the same problem.


Cy Jervis, Manager, Community Program

-Djavax.security.auth.useSubjectCredsOnly=false

That solved the issue with beeline on our external host. Thank you very much.
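
(For beeline specifically, a common way to pass JVM flags is the HADOOP_CLIENT_OPTS environment variable — an assumption about the launch setup, not something confirmed in this thread: export HADOOP_CLIENT_OPTS="-Djavax.security.auth.useSubjectCredsOnly=false")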

New Contributor

Thank you so much! It really works!

java -Djavax.security.auth.useSubjectCredsOnly=false -jar <my jar name>

New Contributor

I was also getting "Failed to find any Kerberos tgt"

 

The " -Djavax.security.auth.useSubjectCredsOnly=false" pointer was the solution.  I was just about to give up on getting hplsql running.  It was frustrating having the exact same connection string work fine for beeline but causing that error for hplsql.  Thanks for posting the fix!