<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question HCatalog and kerberos in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117655#M22265</link>
    <description>&lt;P&gt;I have a Java application which reads Hive metadata using the HCatalog APIs.&lt;/P&gt;&lt;PRE&gt;public static void main(String[] args) {
	HCatClient hcatClient = null;
	try {
		HiveConf hcatConf = new HiveConf();
		hcatConf.setVar(HiveConf.ConfVars.METASTOREURIS, "thrift://192.168.42.154:9083");
		hcatConf.set(HCatConstants.HCAT_HIVE_CLIENT_DISABLE_CACHE, "true");
		hcatClient = HCatClient.create(new Configuration(hcatConf));
		List&amp;lt;String&amp;gt; dbs = hcatClient.listDatabaseNamesByPattern("*");
		for (String string : dbs) {
			System.out.println(string);
		}
	} catch (Throwable t) {
		t.printStackTrace();
	} finally {
		if (hcatClient != null)
			try {
				hcatClient.close();
			} catch (HCatException e) {
			}
	}
}&lt;/PRE&gt;&lt;P&gt;I get the following exception on a cluster with Kerberos:&lt;/P&gt;&lt;PRE&gt;org.apache.hive.hcatalog.common.HCatException : 9001 : Exception occurred while processing HCat request : MetaException while listing db names. Cause : MetaException(message:Got exception: org.apache.thrift.transport.TTransportException java.net.SocketTimeoutException: Read timed out)
 at org.apache.hive.hcatalog.api.HCatClientHMSImpl.listDatabaseNamesByPattern(HCatClientHMSImpl.java:68)&lt;/PRE&gt;</description>
    <pubDate>Wed, 09 Mar 2016 04:03:02 GMT</pubDate>
    <dc:creator>rjotwani-211490215</dc:creator>
    <dc:date>2016-03-09T04:03:02Z</dc:date>
    <item>
      <title>HCatalog and kerberos</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117655#M22265</link>
      <description>&lt;P&gt;I have a Java application which reads Hive metadata using the HCatalog APIs.&lt;/P&gt;&lt;PRE&gt;public static void main(String[] args) {
	HCatClient hcatClient = null;
	try {
		HiveConf hcatConf = new HiveConf();
		hcatConf.setVar(HiveConf.ConfVars.METASTOREURIS, "thrift://192.168.42.154:9083");
		hcatConf.set(HCatConstants.HCAT_HIVE_CLIENT_DISABLE_CACHE, "true");
		hcatClient = HCatClient.create(new Configuration(hcatConf));
		List&amp;lt;String&amp;gt; dbs = hcatClient.listDatabaseNamesByPattern("*");
		for (String string : dbs) {
			System.out.println(string);
		}
	} catch (Throwable t) {
		t.printStackTrace();
	} finally {
		if (hcatClient != null)
			try {
				hcatClient.close();
			} catch (HCatException e) {
			}
	}
}&lt;/PRE&gt;&lt;P&gt;I get the following exception on a cluster with Kerberos:&lt;/P&gt;&lt;PRE&gt;org.apache.hive.hcatalog.common.HCatException : 9001 : Exception occurred while processing HCat request : MetaException while listing db names. Cause : MetaException(message:Got exception: org.apache.thrift.transport.TTransportException java.net.SocketTimeoutException: Read timed out)
 at org.apache.hive.hcatalog.api.HCatClientHMSImpl.listDatabaseNamesByPattern(HCatClientHMSImpl.java:68)&lt;/PRE&gt;</description>
      <pubDate>Wed, 09 Mar 2016 04:03:02 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117655#M22265</guid>
      <dc:creator>rjotwani-211490215</dc:creator>
      <dc:date>2016-03-09T04:03:02Z</dc:date>
    </item>
    <item>
      <title>Re: HCatalog and kerberos</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117656#M22266</link>
      <description>&lt;P&gt;This is the error I see in the hivemetastore.log&lt;/P&gt;&lt;PRE&gt;cmd=get_all_databases 
2016-03-14 06:25:47,041 INFO  [pool-5-thread-197]: metastore.HiveMetaStore (HiveMetaStore.java:newRawStore(590)) - 195: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
2016-03-14 06:25:47,041 INFO  [pool-5-thread-197]: metastore.ObjectStore (ObjectStore.java:initialize(290)) - ObjectStore, initialize called
2016-03-14 06:25:47,042 WARN  [pool-5-thread-197]: metastore.MetaStoreDirectSql (MetaStoreDirectSql.java:determineDbType(160)) - DB Product name[PostgreSQL] obtained, but not used to determine db type. Falling back to using SQL to determine which db we're using
2016-03-14 06:25:47,044 INFO  [pool-5-thread-197]: metastore.MetaStoreDirectSql (MetaStoreDirectSql.java:&amp;lt;init&amp;gt;(140)) - Using direct SQL, underlying DB is OTHER
2016-03-14 06:25:47,045 INFO  [pool-5-thread-197]: metastore.ObjectStore (ObjectStore.java:setConf(273)) - Initialized ObjectStore
2016-03-14 06:26:03,614 ERROR [pool-5-thread-197]: server.TThreadPoolServer (TThreadPoolServer.java:run(296)) - Error occurred during processing of message.
java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: Peer indicated failure: GSS initiate failed
 at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)
 at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:739)
 at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:736)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.security.auth.Subject.doAs(Subject.java:360)
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1637)
 at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:736)
 at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:268)
 at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
 at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
 at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.thrift.transport.TTransportException: Peer indicated failure: GSS initiate failed
 at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:199)
 at org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)
 at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
 at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
 at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
 ... 10 more&lt;/PRE&gt;</description>
      <pubDate>Mon, 14 Mar 2016 23:27:44 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117656#M22266</guid>
      <dc:creator>rjotwani-211490215</dc:creator>
      <dc:date>2016-03-14T23:27:44Z</dc:date>
    </item>
    <item>
      <title>Re: HCatalog and kerberos</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117657#M22267</link>
      <description>&lt;A rel="user" href="https://community.cloudera.com/users/3300/rjotwani.html" nodeid="3300"&gt;@Rachna Bakhru&lt;/A&gt;&lt;P&gt;Please see this: &lt;A href="https://community.hortonworks.com/content/kbentry/17648/access-kerberos-cluster-from-java-using-cached-tic.html" target="_blank"&gt;https://community.hortonworks.com/content/kbentry/17648/access-kerberos-cluster-from-java-using-cached-tic.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Do you have a keytab file for the user that will be accessing the cluster? If yes, then you can use the alternate approach of passing a keytab and a JAAS file.&lt;/P&gt;</description>
      <pubDate>Mon, 14 Mar 2016 23:58:22 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117657#M22267</guid>
      <dc:creator>shishir_saxena4</dc:creator>
      <dc:date>2016-03-14T23:58:22Z</dc:date>
    </item>
    <item>
      <title>Re: HCatalog and kerberos</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117658#M22268</link>
      <description>&lt;P&gt;Yes we do have the keytab file.&lt;/P&gt;&lt;P&gt;hcatConf.setVar(HiveConf.ConfVars.&lt;STRONG&gt;&lt;EM&gt;METASTORE_KERBEROS_KEYTAB_FILE&lt;/EM&gt;&lt;/STRONG&gt;, keytab);&lt;/P&gt;&lt;P&gt;Now we get this error.&lt;/P&gt;&lt;PRE&gt;2016-03-14 13:32:35,223 ERROR [pool-5-thread-2]: server.TThreadPoolServer (TThreadPoolServer.java:run(296)) - Error occurred during processing of message.
java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: Invalid status -128
 at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)
 at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:739)
 at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:736)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.security.auth.Subject.doAs(Subject.java:360)
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1637)
 at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:736)
 at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:268)
 at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
 at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
 at java.lang.Thread.run(Thread.java:745)&lt;/PRE&gt;</description>
      <pubDate>Tue, 15 Mar 2016 01:47:21 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117658#M22268</guid>
      <dc:creator>rjotwani-211490215</dc:creator>
      <dc:date>2016-03-15T01:47:21Z</dc:date>
    </item>
    <item>
      <title>Re: HCatalog and kerberos</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117659#M22269</link>
      <description>&lt;P&gt;When I set&lt;/P&gt;&lt;P&gt;hcatConf.setVar(HiveConf.ConfVars.&lt;STRONG&gt;&lt;EM&gt;METASTORE_USE_THRIFT_SASL&lt;/EM&gt;&lt;/STRONG&gt;, "true");&lt;/P&gt;&lt;P&gt;I get this error:&lt;/P&gt;&lt;PRE&gt;SEVERE: org/apache/commons/configuration/Configuration
java.lang.NoClassDefFoundError: org/apache/commons/configuration/Configuration
 at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.&amp;lt;init&amp;gt;(DefaultMetricsSystem.java:38)
 at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.&amp;lt;clinit&amp;gt;(DefaultMetricsSystem.java:36)
 at org.apache.hadoop.security.UserGroupInformation$UgiMetrics.create(UserGroupInformation.java:97)
 at org.apache.hadoop.security.UserGroupInformation.&amp;lt;clinit&amp;gt;(UserGroupInformation.java:190)
 at org.apache.hadoop.hive.shims.HadoopShimsSecure.getTokenStrForm(HadoopShimsSecure.java:455)
 at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:313)
 at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.&amp;lt;init&amp;gt;(HiveMetaStoreClient.java:214)
 at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.&amp;lt;init&amp;gt;(HiveMetaStoreClient.java:154)&lt;/PRE&gt;&lt;P&gt;Shouldn't it use org.apache.hadoop.conf.Configuration?&lt;/P&gt;</description>
      <pubDate>Tue, 15 Mar 2016 01:55:24 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117659#M22269</guid>
      <dc:creator>rjotwani-211490215</dc:creator>
      <dc:date>2016-03-15T01:55:24Z</dc:date>
    </item>
    <item>
      <title>Re: HCatalog and kerberos</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117660#M22270</link>
      <description>&lt;P&gt;This error was resolved by adding the commons-configuration-.x.x.jar to the classpath.&lt;/P&gt;</description>
      <pubDate>Tue, 15 Mar 2016 03:32:29 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117660#M22270</guid>
      <dc:creator>rjotwani-211490215</dc:creator>
      <dc:date>2016-03-15T03:32:29Z</dc:date>
    </item>
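A note on the fix above: if the project is built with Maven rather than by copying jars, the missing org/apache/commons/configuration/Configuration class comes from the Commons Configuration artifact. A sketch of the dependency, with the version left as a placeholder since the thread elides it:

```xml
<dependency>
  <groupId>commons-configuration</groupId>
  <artifactId>commons-configuration</artifactId>
  <version>x.x</version><!-- use the version shipped with your Hadoop distribution -->
</dependency>
```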
    <item>
      <title>Re: HCatalog and kerberos</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117661#M22271</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/3300/rjotwani.html" nodeid="3300"&gt;@Rachna Bakhru&lt;/A&gt; Is your problem fully resolved now? As I understand, you made two changes in the code&lt;/P&gt;&lt;PRE&gt;hcatConf.setVar(HiveConf.ConfVars.&lt;STRONG&gt;&lt;EM&gt;METASTORE_KERBEROS_KEYTAB_FILE&lt;/EM&gt;&lt;/STRONG&gt;, keytab);
hcatConf.setVar(HiveConf.ConfVars.&lt;STRONG&gt;&lt;EM&gt;METASTORE_USE_THRIFT_SASL&lt;/EM&gt;&lt;/STRONG&gt;, "true");&lt;/PRE&gt;&lt;P&gt;and added commons-configuration-.x.x.jar to your classpath. Can you confirm, so this question can be closed?&lt;/P&gt;</description>
      <pubDate>Tue, 15 Mar 2016 04:04:35 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117661#M22271</guid>
      <dc:creator>shishir_saxena4</dc:creator>
      <dc:date>2016-03-15T04:04:35Z</dc:date>
    </item>
    <item>
      <title>Re: HCatalog and kerberos</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117662#M22272</link>
      <description>&lt;P&gt;We are currently getting this error...&lt;/P&gt;&lt;PRE&gt;16:28:11,820  INFO metastore:297 - Trying to connect to metastore with URI thrift://192.168.42.154:9083
16:28:11,851 ERROR TSaslTransport:296 - SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
 at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
 at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
 at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:253)
 at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
 at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
 at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.security.auth.Subject.doAs(Subject.java:415)
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
 at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
 at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:336)
 at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.&amp;lt;init&amp;gt;(HiveMetaStoreClient.java:214)
 at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.&amp;lt;init&amp;gt;(HiveMetaStoreClient.java:154)
 ......
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
 at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
 at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
 at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
 at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
 at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
 at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
 at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)&lt;/PRE&gt;</description>
      <pubDate>Tue, 15 Mar 2016 04:14:00 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117662#M22272</guid>
      <dc:creator>rjotwani-211490215</dc:creator>
      <dc:date>2016-03-15T04:14:00Z</dc:date>
    </item>
    <item>
      <title>Re: HCatalog and kerberos</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117663#M22273</link>
      <description>&lt;P&gt;No the problem isn't resolved yet.&lt;/P&gt;</description>
      <pubDate>Tue, 15 Mar 2016 04:15:14 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117663#M22273</guid>
      <dc:creator>rjotwani-211490215</dc:creator>
      <dc:date>2016-03-15T04:15:14Z</dc:date>
    </item>
    <item>
      <title>Re: HCatalog and kerberos</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117664#M22274</link>
      <description>&lt;P&gt;Try these additional things. Create a JAAS file with the following configuration, and launch your Java program with these additional options.&lt;/P&gt;&lt;PRE&gt;Client {
 com.sun.security.auth.module.Krb5LoginModule required
 useKeyTab=true
 useTicketCache=false
 renewTicket=true;
};
&lt;/PRE&gt;&lt;PRE&gt;-Djava.security.auth.login.config="path-to-jaas-file" -Djava.security.krb5.conf="path-to-krb5.conf"&lt;/PRE&gt;</description>
      <pubDate>Tue, 15 Mar 2016 05:44:37 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117664#M22274</guid>
      <dc:creator>shishir_saxena4</dc:creator>
      <dc:date>2016-03-15T05:44:37Z</dc:date>
    </item>
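The -Djava.security.krb5.conf flag in the reply above also needs a readable krb5.conf. A minimal sketch of such a file, assuming a single realm; EXAMPLE.COM and the KDC hostname are placeholders to replace with your cluster's actual values:

```ini
[libdefaults]
  default_realm = EXAMPLE.COM

[realms]
  EXAMPLE.COM = {
    kdc = kdc.example.com
    admin_server = kdc.example.com
  }

[domain_realm]
  .example.com = EXAMPLE.COM
  example.com = EXAMPLE.COM
```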
    <item>
      <title>Re: HCatalog and kerberos</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117665#M22275</link>
      <description>&lt;P&gt;I am running my program from a Windows machine.&lt;/P&gt;&lt;P&gt;I used&lt;/P&gt;&lt;PRE&gt;-Djava.security.auth.login.config="path-to-jaas-file" -Djava.security.krb5.conf="path-to-krb5.ini"&lt;/PRE&gt;&lt;PRE&gt;SEVERE: Error creating Hive objects: Could not connect to meta store using any of the URIs provided. Most recent failure: org.apache.thrift.transport.TTransportException: GSS initiate failed
 at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:221)
 at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:297)
 at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
 at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
 at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.security.auth.Subject.doAs(Subject.java:415)
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
 at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
 at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:336)
 at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.&amp;lt;init&amp;gt;(HiveMetaStoreClient.java:214)
 at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.&amp;lt;init&amp;gt;(HiveMetaStoreClient.java:154)&lt;/PRE&gt;&lt;P&gt;Error in hivemetastore.log&lt;/P&gt;&lt;PRE&gt;2016-03-16 13:31:09,808 ERROR [pool-5-thread-200]: server.TThreadPoolServer (TThreadPoolServer.java:run(296)) - Error occurred during processing of message.
java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: Peer indicated failure: GSS initiate failed
 at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)
 at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:739)
 at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:736)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.security.auth.Subject.doAs(Subject.java:360)
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1637)
 at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:736)
 at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:268)
 at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
 at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
 at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.thrift.transport.TTransportException: Peer indicated failure: GSS initiate failed
 at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:199)
 at org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)
 at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
 at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
 at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)&lt;/PRE&gt;</description>
      <pubDate>Thu, 17 Mar 2016 02:42:29 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117665#M22275</guid>
      <dc:creator>rjotwani-211490215</dc:creator>
      <dc:date>2016-03-17T02:42:29Z</dc:date>
    </item>
    <item>
      <title>Re: HCatalog and kerberos</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117666#M22276</link>
      <description>&lt;A rel="user" href="https://community.cloudera.com/users/3300/rjotwani.html" nodeid="3300"&gt;@Rachna Bakhru&lt;/A&gt;&lt;P&gt;Please reply in comments if it is not a new answer.&lt;/P&gt;&lt;P&gt;Here is some sample code to connect to a Kerberized cluster from a Java program on a Windows machine. Your steps should be similar.&lt;/P&gt;&lt;P&gt;1. Copy the krb5.conf file to your Windows machine.&lt;/P&gt;&lt;P&gt;2. Copy the resource files to Windows (core-site.xml, yarn-site.xml, hdfs-site.xml, hive-site.xml).&lt;/P&gt;&lt;P&gt;3. Create a .jaas file with the following configuration.&lt;/P&gt;&lt;PRE&gt;Client {
 com.sun.security.auth.module.Krb5LoginModule required
 useKeyTab=true
 useTicketCache=false
 renewTicket=true;
};
&lt;/PRE&gt;&lt;P&gt;4. Change your login code as follows.&lt;/P&gt;&lt;PRE&gt;conf = new org.apache.hadoop.conf.Configuration();
try {
    String principal = "&amp;lt;principal&amp;gt;";
    String keytab = "&amp;lt;keytab location&amp;gt;";

    conf.set("hadoop.security.authentication", "Kerberos");
    conf.addResource(new Path("./core-site.xml"));
    conf.addResource(new Path("./yarn-site.xml"));
    conf.addResource(new Path("./hdfs-site.xml"));
    UserGroupInformation.setConfiguration(conf);
    UserGroupInformation.loginUserFromKeytab(principal, keytab);&lt;/PRE&gt;&lt;P&gt;5. Then launch the Java program with the following parameters, specifying the paths to krb5.conf and the .jaas file.&lt;/P&gt;&lt;PRE&gt;-Djava.security.auth.login.config="path-to-jaas-file" -Djava.security.krb5.conf="path-to-krb5.conf"&lt;/PRE&gt;</description>
      <pubDate>Thu, 17 Mar 2016 07:59:44 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117666#M22276</guid>
      <dc:creator>shishir_saxena4</dc:creator>
      <dc:date>2016-03-17T07:59:44Z</dc:date>
    </item>
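The -D flags in step 5 above can also be set programmatically before any JAAS/Kerberos code runs, which is what the working code later in this thread ends up doing. A minimal standalone sketch; the class name and file paths are placeholders, not from the thread:

```java
public class KerberosProps {
    // Sets the same JVM properties the -D command-line flags would set.
    // Must run before the first Kerberos/JAAS login attempt, because the
    // security infrastructure reads these properties at initialization.
    public static void configure(String jaasPath, String krb5Path) {
        System.setProperty("java.security.auth.login.config", jaasPath); // path to the .jaas file
        System.setProperty("java.security.krb5.conf", krb5Path);         // path to krb5.conf / krb5.ini
        System.setProperty("sun.security.krb5.debug", "true");           // verbose GSS/Kerberos tracing
    }

    public static void main(String[] args) {
        configure("C:/temp/jaas.conf", "C:/temp/krb5.ini"); // placeholder paths
        System.out.println(System.getProperty("java.security.krb5.conf"));
    }
}
```
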
    <item>
      <title>Re: HCatalog and kerberos</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117667#M22277</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/2820/shishirsaxena3.html" nodeid="2820"&gt;@Shishir Saxena&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Do I keep the original properties?&lt;/P&gt;&lt;PRE&gt;package com.dag.mc.biz.activelinx.emf.snapshot.hadoop;

//import javax.jdo.JDOException;
import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.metastore.HiveMetaStoreClient;
import org.apache.hadoop.security.UserGroupInformation;
import org.apache.hive.hcatalog.api.HCatClient;
import org.apache.hive.hcatalog.api.HCatTable;
import org.apache.hive.hcatalog.common.HCatConstants;
import org.apache.hive.hcatalog.common.HCatException;

public class ListDBs {

	/**
	 * @param args
	 */
	public static void main(String[] args) {
		HCatClient hcatClient = null;
		try {
			String principal = "hive/_HOST@EXAMPLE.COM";
			String keytab = "&amp;lt;keytab location&amp;gt;";

			HiveConf hcatConf = new HiveConf();
			hcatConf.setVar(HiveConf.ConfVars.METASTOREURIS, "thrift://192.168.42.154:9083");
			hcatConf.set("hadoop.security.authentication", "Kerberos");
			hcatConf.set(HCatConstants.HCAT_HIVE_CLIENT_DISABLE_CACHE, "true");
			hcatConf.addResource(new Path("c:/temp/hive-site.xml"));
			hcatConf.setVar(HiveConf.ConfVars.METASTORE_KERBEROS_PRINCIPAL, principal);
			hcatConf.setVar(HiveConf.ConfVars.METASTORE_KERBEROS_KEYTAB_FILE, keytab);
			hcatConf.setVar(HiveConf.ConfVars.METASTORE_USE_THRIFT_SASL, "true");
			hcatClient = HCatClient.create(new Configuration(hcatConf));

			UserGroupInformation.setConfiguration(hcatConf);
			UserGroupInformation.loginUserFromKeytab(principal, keytab);

			HiveMetaStoreClient hiveMetastoreClient = new HiveMetaStoreClient(hcatConf);
			List&amp;lt;String&amp;gt; dbs = hcatClient.listDatabaseNamesByPattern("*");
			for (String db : dbs) {
				System.out.println(db);
				List&amp;lt;String&amp;gt; tables = hcatClient.listTableNamesByPattern(db, "*");
				for (String tableString : tables) {
					HCatTable tbl = hcatClient.getTable(db, tableString);
					String tableType = tbl.getTabletype();
					String tableName = tbl.getTableName();
					if (tableType.equalsIgnoreCase("View")) {
						org.apache.hadoop.hive.metastore.api.Table viewMetastoreObject = hiveMetastoreClient.getTable(db, tableName);
						String sql = viewMetastoreObject.getViewOriginalText();
						System.out.println(sql);
					}
				}
			}
		} catch (Throwable t) {
			t.printStackTrace();
		} finally {
			if (hcatClient != null)
				try {
					hcatClient.close();
				} catch (HCatException e) {
				}
		}
	}
}&lt;/PRE&gt;</description>
      <pubDate>Tue, 22 Mar 2016 03:54:55 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117667#M22277</guid>
      <dc:creator>rjotwani-211490215</dc:creator>
      <dc:date>2016-03-22T03:54:55Z</dc:date>
    </item>
    <item>
      <title>Re: HCatalog and kerberos</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117668#M22278</link>
      <description>&lt;P&gt;Current Error:&lt;/P&gt;&lt;PRE&gt;12:14:39,073 ERROR TSaslTransport:296 - SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
 at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
 at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
 at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:253)
 at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
 at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
 at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.security.auth.Subject.doAs(Subject.java:415)
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
 at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
 at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:336)
 at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.&amp;lt;init&amp;gt;(HiveMetaStoreClient.java:214)
 at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.&amp;lt;init&amp;gt;(HiveMetaStoreClient.java:154)
 at org.apache.hive.hcatalog.common.HiveClientCache.getNonCachedHiveClient(HiveClientCache.java:80)
 at org.apache.hive.hcatalog.common.HCatUtil.getHiveClient(HCatUtil.java:557)
 at org.apache.hive.hcatalog.api.HCatClientHMSImpl.initialize(HCatClientHMSImpl.java:595)
 at org.apache.hive.hcatalog.api.HCatClient.create(HCatClient.java:66)
 at .....
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
 at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
 at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
 at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
 at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
 at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
 at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
 at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
 ... 23 more&lt;/PRE&gt;</description>
      <pubDate>Tue, 22 Mar 2016 23:38:40 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117668#M22278</guid>
      <dc:creator>rjotwani-211490215</dc:creator>
      <dc:date>2016-03-22T23:38:40Z</dc:date>
    </item>
    <item>
      <title>Re: HCatalog and kerberos</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117669#M22279</link>
      <description>&lt;P&gt;The code below worked.&lt;/P&gt;&lt;P&gt;&lt;A href="https://community.hortonworks.com/users/2820/shishirsaxena3.html"&gt;@Shishir Saxena&lt;/A&gt;&lt;/P&gt;&lt;PRE&gt;package hadoop.test;

import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.metastore.HiveMetaStoreClient;
import org.apache.hadoop.security.UserGroupInformation;
import org.apache.hive.hcatalog.api.HCatClient;
import org.apache.hive.hcatalog.api.HCatTable;
import org.apache.hive.hcatalog.common.HCatConstants;
import org.apache.hive.hcatalog.common.HCatException;
import org.apache.hive.hcatalog.data.schema.HCatFieldSchema;
import org.apache.hive.hcatalog.data.schema.HCatSchema;

public class ListDBs1 {

    public static void main(String[] args) {
        HCatClient hcatClient = null;
        try {
            String principal = "hive/quickstart.cloudera@XXX.COM";
            String keytab = "E:\\apps\\metacenter_home\\hadoop\\hive.keytab";
            System.setProperty("sun.security.krb5.debug", "true");
            System.setProperty("java.security.krb5.conf", "E:\\apps\\hadoop\\krb5.conf");
            System.setProperty("java.security.auth.login.config", "E:\\apps\\hadoop\\jaas.conf");

            HiveConf hcatConf = new HiveConf();
            hcatConf.setVar(HiveConf.ConfVars.METASTOREURIS, "thrift://server:9083");
            hcatConf.set("hadoop.security.authentication", "kerberos");
            hcatConf.set(HCatConstants.HCAT_HIVE_CLIENT_DISABLE_CACHE, "true");
            hcatConf.setVar(HiveConf.ConfVars.METASTORE_KERBEROS_PRINCIPAL, principal);
            hcatConf.setVar(HiveConf.ConfVars.METASTORE_KERBEROS_KEYTAB_FILE, keytab);
            hcatConf.setVar(HiveConf.ConfVars.METASTORE_USE_THRIFT_SASL, "true");

            UserGroupInformation.setConfiguration(hcatConf);
            UserGroupInformation.loginUserFromKeytab(principal, keytab);

            hcatClient = HCatClient.create(new Configuration(hcatConf));
            HiveMetaStoreClient hiveMetastoreClient = new HiveMetaStoreClient(hcatConf);
            list(hcatClient, hiveMetastoreClient);
        } catch (Throwable t) {
            t.printStackTrace();
        } finally {
            if (hcatClient != null) {
                try {
                    hcatClient.close();
                } catch (HCatException e) {
                    // ignore close failure
                }
            }
        }
    }

    private static void list(HCatClient hcatClient, HiveMetaStoreClient hiveMetastoreClient) throws Exception {
        List&amp;lt;String&amp;gt; dbs = hcatClient.listDatabaseNamesByPattern("*");
        for (String db : dbs) {
            System.out.println(db);
            List&amp;lt;String&amp;gt; tables = hcatClient.listTableNamesByPattern(db, "*");
            for (String tableString : tables) {
                HCatTable tbl = hcatClient.getTable(db, tableString);
                String tableType = tbl.getTabletype();
                String tableName = tbl.getTableName();
                System.out.println(tableType + " - " + tableName);
                System.out.println("Table Name is: " + tableName);
                System.out.println("Table Type is: " + tbl.getTabletype());
                System.out.println("Table Props are: " + tbl.getTblProps());
                List&amp;lt;HCatFieldSchema&amp;gt; fields = tbl.getCols();
                for (HCatFieldSchema f : fields) {
                    System.out.println("Field Name is: " + f.getName());
                    System.out.println("Field Type String is: " + f.getTypeString());
                    System.out.println("Field Type Category is: " + f.getCategory());
                    if (f.getCategory().equals(HCatFieldSchema.Category.STRUCT)) {
                        HCatSchema schema = f.getStructSubSchema();
                        List&amp;lt;String&amp;gt; structFields = schema.getFieldNames();
                        for (String fieldName : structFields) {
                            System.out.println("Struct Field Name is: " + fieldName);
                        }
                    }
                }
                if (tableType.equalsIgnoreCase("View") || tableType.equalsIgnoreCase("VIRTUAL_VIEW")) {
                    org.apache.hadoop.hive.metastore.api.Table viewMetastoreObject = hiveMetastoreClient.getTable(db, tableName);
                    String sql = viewMetastoreObject.getViewOriginalText();
                    System.out.println(sql);
                }
            }
        }
    }
}&lt;/PRE&gt;</description>
      <pubDate>Wed, 06 Jul 2016 02:47:27 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/HCatalog-and-kerberos/m-p/117669#M22279</guid>
      <dc:creator>rjotwani-211490215</dc:creator>
      <dc:date>2016-07-06T02:47:27Z</dc:date>
    </item>
  </channel>
</rss>