<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Hive JDBC client error when connecting to Kerberos Cloudera cluster in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-JDBC-client-error-when-connecting-to-Kerberos-Cloudera/m-p/30831#M7005</link>
    <description>&lt;P&gt;Here is&amp;nbsp;the additional stack trace&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;2015-08-14 18:44:40 INFO Utils:285 - Supplied authorities: hiveserver-ip-address:10000&lt;BR /&gt;2015-08-14 18:44:40 WARN Utils:401 - ***** JDBC param deprecation *****&lt;BR /&gt;2015-08-14 18:44:40 WARN Utils:402 - The use of sasl.qop is deprecated.&lt;BR /&gt;2015-08-14 18:44:40 WARN Utils:403 - Please use saslQop like so: jdbc:hive2://&amp;lt;host&amp;gt;:&amp;lt;port&amp;gt;/dbName;saslQop=&amp;lt;qop_value&amp;gt;&lt;BR /&gt;2015-08-14 18:44:40 INFO Utils:372 - Resolved authority: hiveserver-ip-address:10000&lt;BR /&gt;2015-08-14 18:44:40 DEBUG MutableMetricsFactory:42 - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])&lt;BR /&gt;2015-08-14 18:44:40 DEBUG MutableMetricsFactory:42 - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])&lt;BR /&gt;2015-08-14 18:44:40 DEBUG MutableMetricsFactory:42 - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[GetGroups])&lt;BR /&gt;2015-08-14 18:44:40 DEBUG MetricsSystemImpl:231 - UgiMetrics, User and group related metrics&lt;BR /&gt;2015-08-14 18:44:40 DEBUG Groups:301 - Creating new Groups object&lt;BR /&gt;2015-08-14 18:44:40 DEBUG NativeCodeLoader:46 - Trying to load the 
custom-built native-hadoop library...&lt;BR /&gt;2015-08-14 18:44:40 DEBUG NativeCodeLoader:55 - Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path&lt;BR /&gt;2015-08-14 18:44:40 DEBUG NativeCodeLoader:56 - java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib&lt;BR /&gt;2015-08-14 18:44:40 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable&lt;BR /&gt;2015-08-14 18:44:40 DEBUG PerformanceAdvisory:41 - Falling back to shell based&lt;BR /&gt;2015-08-14 18:44:40 DEBUG JniBasedUnixGroupsMappingWithFallback:45 - Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping&lt;BR /&gt;2015-08-14 18:44:40 DEBUG Groups:112 - Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000&lt;BR /&gt;2015-08-14 18:44:40 DEBUG UserGroupInformation:221 - hadoop login&lt;BR /&gt;2015-08-14 18:44:40 DEBUG UserGroupInformation:156 - hadoop login commit&lt;BR /&gt;2015-08-14 18:44:40 DEBUG UserGroupInformation:186 - using local user:UnixPrincipal: &amp;lt;my login name&amp;gt;&lt;BR /&gt;2015-08-14 18:44:40 DEBUG UserGroupInformation:192 - Using user: "UnixPrincipal: &amp;lt;my login name&amp;gt;" with name &amp;lt;my login name&amp;gt;&lt;BR /&gt;2015-08-14 18:44:40 DEBUG UserGroupInformation:202 - User entry: "&amp;lt;my login name&amp;gt;"&lt;BR /&gt;2015-08-14 18:44:40 DEBUG UserGroupInformation:840 - UGI loginUser:&amp;lt;my login name&amp;gt; (auth:SIMPLE)&lt;BR /&gt;2015-08-14 18:44:40 DEBUG HadoopThriftAuthBridge:155 - Current authMethod = SIMPLE&lt;BR /&gt;2015-08-14 18:44:40 DEBUG HadoopThriftAuthBridge:93 - Setting UGI conf as passed-in authMethod of kerberos != current.&lt;BR /&gt;2015-08-14 18:44:40 INFO HiveConnection:189 - Will try to open client transport with JDBC Uri: 
jdbc:hive2://hiveserver-ip-address:10000/default;principal=hive/_HOST@A.B.COM;sasl.qop=auth-conf&lt;BR /&gt;2015-08-14 18:44:40 DEBUG UserGroupInformation:1693 - PrivilegedAction as:&amp;lt;my login name&amp;gt; (auth:SIMPLE) from:org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)&lt;BR /&gt;2015-08-14 18:44:40 DEBUG TSaslTransport:261 - opening transport org.apache.thrift.transport.TSaslClientTransport@41c2284a&lt;BR /&gt;2015-08-14 18:44:40 ERROR TSaslTransport:315 - SASL negotiation failure&lt;BR /&gt;javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]&lt;BR /&gt;at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)&lt;BR /&gt;at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)&lt;BR /&gt;at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)&lt;BR /&gt;at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)&lt;BR /&gt;at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)&lt;BR /&gt;at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at javax.security.auth.Subject.doAs(Subject.java:422)&lt;BR /&gt;at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)&lt;BR /&gt;at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)&lt;BR /&gt;at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:190)&lt;BR /&gt;at org.apache.hive.jdbc.HiveConnection.&amp;lt;init&amp;gt;(HiveConnection.java:163)&lt;BR /&gt;at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)&lt;BR /&gt;at 
java.sql.DriverManager.getConnection(DriverManager.java:664)&lt;BR /&gt;at java.sql.DriverManager.getConnection(DriverManager.java:208)&lt;/P&gt;</description>
    <pubDate>Fri, 14 Aug 2015 23:01:02 GMT</pubDate>
    <dc:creator>trainingmyhobby</dc:creator>
    <dc:date>2015-08-14T23:01:02Z</dc:date>
    <item>
      <title>Hive JDBC client error when connecting to Kerberos Cloudera cluster</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-JDBC-client-error-when-connecting-to-Kerberos-Cloudera/m-p/30829#M7004</link>
      <description>&lt;P&gt;Hi All,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;In my project I need to connect to Hive through JDBC. I developed a JDBC client program, and when connecting I get the error below. What do I need to do?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Cloudera CDH version (Hadoop 2.6.0-cdh5.4.3)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;TUGIAssumingTransport.java:49)&lt;/P&gt;&lt;P&gt;2015-08-14 18:16:55 DEBUG TSaslTransport:261 - opening transport org.apache.thrift.transport.TSaslClientTransport@41c2284a&lt;BR /&gt;2015-08-14 18:16:55 ERROR TSaslTransport:315 - SASL negotiation failure&lt;BR /&gt;javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Below is a snippet of the HiveClient program, followed by the steps I am executing:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;public class MyHiveClient {&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;public static Connection createConnection() throws Exception {&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Class.forName("org.apache.hive.jdbc.HiveDriver");&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;String hive2JDBCConnectionURL = "jdbc:hive2://hiveserver-ip-address:10000/default;principal=hive/_HOST@A.B.COM;sasl.qop=auth-conf";&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;return DriverManager.getConnection(hive2JDBCConnectionURL, new Properties());&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;}&lt;/P&gt;&lt;P&gt;}&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;My Java class is invoked through a shell script (say run_metrics.sh). 
This Java class internally creates a Hive JDBC connection by invoking MyHiveClient.createConnection above.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;This is what I do:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;kinit&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;lt;enter password here&amp;gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;./run_metrics.sh&lt;/P&gt;&lt;P&gt;&amp;lt;Now I get the above error&amp;gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 16 Sep 2022 09:37:54 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-JDBC-client-error-when-connecting-to-Kerberos-Cloudera/m-p/30829#M7004</guid>
      <dc:creator>trainingmyhobby</dc:creator>
      <dc:date>2022-09-16T09:37:54Z</dc:date>
    </item>
    <item>
      <title>Re: Hive JDBC client error when connecting to Kerberos Cloudera cluster</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-JDBC-client-error-when-connecting-to-Kerberos-Cloudera/m-p/30831#M7005</link>
      <description>&lt;P&gt;Here is&amp;nbsp;the additional stack trace&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;2015-08-14 18:44:40 INFO Utils:285 - Supplied authorities: hiveserver-ip-address:10000&lt;BR /&gt;2015-08-14 18:44:40 WARN Utils:401 - ***** JDBC param deprecation *****&lt;BR /&gt;2015-08-14 18:44:40 WARN Utils:402 - The use of sasl.qop is deprecated.&lt;BR /&gt;2015-08-14 18:44:40 WARN Utils:403 - Please use saslQop like so: jdbc:hive2://&amp;lt;host&amp;gt;:&amp;lt;port&amp;gt;/dbName;saslQop=&amp;lt;qop_value&amp;gt;&lt;BR /&gt;2015-08-14 18:44:40 INFO Utils:372 - Resolved authority: hiveserver-ip-address:10000&lt;BR /&gt;2015-08-14 18:44:40 DEBUG MutableMetricsFactory:42 - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])&lt;BR /&gt;2015-08-14 18:44:40 DEBUG MutableMetricsFactory:42 - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])&lt;BR /&gt;2015-08-14 18:44:40 DEBUG MutableMetricsFactory:42 - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[GetGroups])&lt;BR /&gt;2015-08-14 18:44:40 DEBUG MetricsSystemImpl:231 - UgiMetrics, User and group related metrics&lt;BR /&gt;2015-08-14 18:44:40 DEBUG Groups:301 - Creating new Groups object&lt;BR /&gt;2015-08-14 18:44:40 DEBUG NativeCodeLoader:46 - Trying to load the 
custom-built native-hadoop library...&lt;BR /&gt;2015-08-14 18:44:40 DEBUG NativeCodeLoader:55 - Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path&lt;BR /&gt;2015-08-14 18:44:40 DEBUG NativeCodeLoader:56 - java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib&lt;BR /&gt;2015-08-14 18:44:40 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable&lt;BR /&gt;2015-08-14 18:44:40 DEBUG PerformanceAdvisory:41 - Falling back to shell based&lt;BR /&gt;2015-08-14 18:44:40 DEBUG JniBasedUnixGroupsMappingWithFallback:45 - Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping&lt;BR /&gt;2015-08-14 18:44:40 DEBUG Groups:112 - Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000&lt;BR /&gt;2015-08-14 18:44:40 DEBUG UserGroupInformation:221 - hadoop login&lt;BR /&gt;2015-08-14 18:44:40 DEBUG UserGroupInformation:156 - hadoop login commit&lt;BR /&gt;2015-08-14 18:44:40 DEBUG UserGroupInformation:186 - using local user:UnixPrincipal: &amp;lt;my login name&amp;gt;&lt;BR /&gt;2015-08-14 18:44:40 DEBUG UserGroupInformation:192 - Using user: "UnixPrincipal: &amp;lt;my login name&amp;gt;" with name &amp;lt;my login name&amp;gt;&lt;BR /&gt;2015-08-14 18:44:40 DEBUG UserGroupInformation:202 - User entry: "&amp;lt;my login name&amp;gt;"&lt;BR /&gt;2015-08-14 18:44:40 DEBUG UserGroupInformation:840 - UGI loginUser:&amp;lt;my login name&amp;gt; (auth:SIMPLE)&lt;BR /&gt;2015-08-14 18:44:40 DEBUG HadoopThriftAuthBridge:155 - Current authMethod = SIMPLE&lt;BR /&gt;2015-08-14 18:44:40 DEBUG HadoopThriftAuthBridge:93 - Setting UGI conf as passed-in authMethod of kerberos != current.&lt;BR /&gt;2015-08-14 18:44:40 INFO HiveConnection:189 - Will try to open client transport with JDBC Uri: 
jdbc:hive2://hiveserver-ip-address:10000/default;principal=hive/_HOST@A.B.COM;sasl.qop=auth-conf&lt;BR /&gt;2015-08-14 18:44:40 DEBUG UserGroupInformation:1693 - PrivilegedAction as:&amp;lt;my login name&amp;gt; (auth:SIMPLE) from:org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)&lt;BR /&gt;2015-08-14 18:44:40 DEBUG TSaslTransport:261 - opening transport org.apache.thrift.transport.TSaslClientTransport@41c2284a&lt;BR /&gt;2015-08-14 18:44:40 ERROR TSaslTransport:315 - SASL negotiation failure&lt;BR /&gt;javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]&lt;BR /&gt;at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)&lt;BR /&gt;at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)&lt;BR /&gt;at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)&lt;BR /&gt;at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)&lt;BR /&gt;at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)&lt;BR /&gt;at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at javax.security.auth.Subject.doAs(Subject.java:422)&lt;BR /&gt;at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)&lt;BR /&gt;at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)&lt;BR /&gt;at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:190)&lt;BR /&gt;at org.apache.hive.jdbc.HiveConnection.&amp;lt;init&amp;gt;(HiveConnection.java:163)&lt;BR /&gt;at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)&lt;BR /&gt;at 
java.sql.DriverManager.getConnection(DriverManager.java:664)&lt;BR /&gt;at java.sql.DriverManager.getConnection(DriverManager.java:208)&lt;/P&gt;</description>
      <pubDate>Fri, 14 Aug 2015 23:01:02 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-JDBC-client-error-when-connecting-to-Kerberos-Cloudera/m-p/30831#M7005</guid>
      <dc:creator>trainingmyhobby</dc:creator>
      <dc:date>2015-08-14T23:01:02Z</dc:date>
    </item>
    <item>
      <title>Re: Hive JDBC client error when connecting to Kerberos Cloudera cluster</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-JDBC-client-error-when-connecting-to-Kerberos-Cloudera/m-p/30957#M7006</link>
      <description>&lt;P&gt;I was able to resolve this issue; the Oracle link below helped me resolve it.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;A href="http://docs.oracle.com/javase/7/docs/technotes/guides/security/jgss/tutorials/Troubleshooting.html" target="_blank"&gt;http://docs.oracle.com/javase/7/docs/technotes/guides/security/jgss/tutorials/Troubleshooting.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos Ticket)&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The solution is to specify&amp;nbsp;&lt;SPAN&gt;-Djavax.security.auth.useSubjectCredsOnly=false when executing my Java program from the command line.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;That means:&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;java&amp;nbsp;-Djavax.security.auth.useSubjectCredsOnly=false ...........&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;My Java program internally uses the Hive JDBC API.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;This is what I did:&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;1. kinit from the command line&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;2. Run the Java program with the -D property above, specifying the appropriate Hive JDBC URL with the principal name, etc.&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 19 Aug 2015 22:47:04 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-JDBC-client-error-when-connecting-to-Kerberos-Cloudera/m-p/30957#M7006</guid>
      <dc:creator>trainingmyhobby</dc:creator>
      <dc:date>2015-08-19T22:47:04Z</dc:date>
    </item>
    <item>
      <title>Re: Hive JDBC client error when connecting to Kerberos Cloudera cluster</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-JDBC-client-error-when-connecting-to-Kerberos-Cloudera/m-p/30973#M7007</link>
      <description>&lt;P&gt;Congratulations on solving the issue and thank you for posting the solution in case others have the same one.&lt;/P&gt;</description>
      <pubDate>Thu, 20 Aug 2015 12:12:19 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-JDBC-client-error-when-connecting-to-Kerberos-Cloudera/m-p/30973#M7007</guid>
      <dc:creator>cjervis</dc:creator>
      <dc:date>2015-08-20T12:12:19Z</dc:date>
    </item>
    <item>
      <title>Re: Hive JDBC client error when connecting to Kerberos Cloudera cluster</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-JDBC-client-error-when-connecting-to-Kerberos-Cloudera/m-p/60878#M7008</link>
      <description>-Djavax.security.auth.useSubjectCredsOnly=false&lt;BR /&gt;&lt;BR /&gt;That solved the issue with beeline on our external host. Thank you very much.&lt;BR /&gt;</description>
      <pubDate>Thu, 12 Oct 2017 21:08:28 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-JDBC-client-error-when-connecting-to-Kerberos-Cloudera/m-p/60878#M7008</guid>
      <dc:creator>manishsingh2k</dc:creator>
      <dc:date>2017-10-12T21:08:28Z</dc:date>
    </item>
    <item>
      <title>Re: Hive JDBC client error when connecting to Kerberos Cloudera cluster</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-JDBC-client-error-when-connecting-to-Kerberos-Cloudera/m-p/68006#M7009</link>
      <description>&lt;P&gt;I was also getting "&lt;SPAN&gt;Failed to find any Kerberos tgt".&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;The "-Djavax.security.auth.useSubjectCredsOnly=false" pointer was the solution.&amp;nbsp; I was just about to give up on getting hplsql running.&amp;nbsp; It was frustrating having the exact same connection string work fine for beeline but cause that error for hplsql.&amp;nbsp; Thanks for posting the fix!&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 08 Jun 2018 17:31:56 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-JDBC-client-error-when-connecting-to-Kerberos-Cloudera/m-p/68006#M7009</guid>
      <dc:creator>cupdike</dc:creator>
      <dc:date>2018-06-08T17:31:56Z</dc:date>
    </item>
    <item>
      <title>Re: Hive JDBC client error when connecting to Kerberos Cloudera cluster</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-JDBC-client-error-when-connecting-to-Kerberos-Cloudera/m-p/77205#M7010</link>
      <description>&lt;P&gt;Thank you so much! It really works!&lt;/P&gt;&lt;P&gt;java -Djavax.security.auth.useSubjectCredsOnly=false -jar &amp;lt;my jar name&amp;gt;&lt;/P&gt;</description>
      <pubDate>Fri, 20 Jul 2018 08:41:47 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Hive-JDBC-client-error-when-connecting-to-Kerberos-Cloudera/m-p/77205#M7010</guid>
      <dc:creator>Arc</dc:creator>
      <dc:date>2018-07-20T08:41:47Z</dc:date>
    </item>
  </channel>
</rss>