HiveMetaStoreClient fails to connect to a Kerberized cluster



Kerberized HDP-2.6.3.0.

**********Start-Edit-1**********

I created an 'uber' jar and executed the above class directly on one of the metastore servers (Linux machines), yet the error persists.

**********End-Edit-1**********

I have test code running on my local Windows 7 machine. Note the commented-out sections; enabling or disabling them makes no difference.

private static void connectHiveMetastore() throws MetaException, MalformedURLException {

        System.setProperty("hadoop.home.dir", "E:\\Development\\Software\\Virtualization");

        /*Start : Commented or un-commented, immaterial ...*/
        System.setProperty("javax.security.auth.useSubjectCredsOnly","false");
        System.setProperty("java.security.auth.login.config","E:\\Development\\lib\\hdp\\loginconf.ini");
        System.setProperty("java.security.krb5.conf","E:\\Development\\lib\\hdp\\krb5.conf");
        /*End : Commented or un-commented, immaterial ...*/

        Configuration configuration = new Configuration();
        /*Start : Commented or un-commented, immaterial ...*/
        //configuration.addResource("E:\\Development\\lib\\hdp\\client_config\\HDFS_CLIENT\\core-site.xml");
        //configuration.addResource("E:\\Development\\lib\\hdp\\client_config\\HDFS_CLIENT\\hdfs-site.xml");
        //configuration.addResource("E:\\Development\\lib\\hdp\\client_config\\HIVE_CLIENT\\hive-site.xml");
        //configuration.set("hive.server2.authentication","KERBEROS");
        //configuration.set("hadoop.security.authentication", "Kerberos");
        /*End : Commented or un-commented, immaterial ...*/

        HiveConf hiveConf = new HiveConf(configuration,Configuration.class);
        /*Start : Commented or un-commented, immaterial ...*/
        /*URL url = new File("E:\\Development\\lib\\hdp\\client_config\\HIVE_CLIENT\\hive-site.xml").toURI().toURL();
        HiveConf.setHiveSiteLocation(url);*/
        //hiveConf.setVar(HiveConf.ConfVars.HIVE_SERVER2_AUTHENTICATION,"KERBEROS");

        /*End : Commented or un-commented, immaterial ...*/

        hiveConf.setVar(HiveConf.ConfVars.METASTOREURIS,"thrift://l4283t.sss.se.com:9083,thrift://l4284t.sss.se.com:9083");
        HiveMetaStoreClient hiveMetaStoreClient = new HiveMetaStoreClient(hiveConf);

        System.out.println("Metastore client : "+hiveMetaStoreClient);
        System.out.println("Is local metastore ? "+hiveMetaStoreClient.isLocalMetaStore());
        System.out.println(hiveMetaStoreClient.getAllDatabases());

        hiveMetaStoreClient.close();
    }
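Given that the server-side error (below) comes from the SASL server transport, one likely cause is that this client never negotiates SASL at all: a HiveConf built from an empty Configuration does not pick up the cluster's hive-site.xml, so hive.metastore.sasl.enabled stays at its default of false and no Kerberos login is performed. A minimal sketch of a SASL-enabled client, assuming the HDP 2.6 client jars on the classpath and a reachable KDC; the user principal and keytab path are illustrative, not from the original post:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.metastore.HiveMetaStoreClient;
import org.apache.hadoop.security.UserGroupInformation;

public class SaslMetastoreClientSketch {
    public static void main(String[] args) throws Exception {
        System.setProperty("java.security.krb5.conf", "E:\\Development\\lib\\hdp\\krb5.conf");

        Configuration conf = new Configuration();
        // Force Kerberos authentication instead of the default "simple".
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);
        // Illustrative principal/keytab -- substitute a real user keytab.
        UserGroupInformation.loginUserFromKeytab(
                "ojoqcu@GLOBAL.SCD.COM", "E:\\Development\\lib\\hdp\\ojoqcu.keytab");

        HiveConf hiveConf = new HiveConf(conf, Configuration.class);
        hiveConf.setVar(HiveConf.ConfVars.METASTOREURIS,
                "thrift://l4283t.sss.se.com:9083,thrift://l4284t.sss.se.com:9083");
        // These two settings make the client open a SASL/GSSAPI transport,
        // matching the server's hive.metastore.sasl.enabled=true.
        hiveConf.setBoolVar(HiveConf.ConfVars.METASTORE_USE_THRIFT_SASL, true);
        hiveConf.setVar(HiveConf.ConfVars.METASTORE_KERBEROS_PRINCIPAL,
                "hive/_HOST@GLOBAL.SCD.COM");

        HiveMetaStoreClient client = new HiveMetaStoreClient(hiveConf);
        System.out.println(client.getAllDatabases());
        client.close();
    }
}
```

This sketch requires a live Kerberized cluster to run; the point is the two METASTORE_USE_THRIFT_SASL / METASTORE_KERBEROS_PRINCIPAL settings plus an explicit UserGroupInformation login before the client is constructed.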

The output I receive on the local machine (note 'Is local metastore ? false'):

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/E:/Development/lib/hdp/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/E:/Development/lib/hdp/hive2/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2017-11-28 14:20:14,631 INFO  [main] hive.metastore (HiveMetaStoreClient.java:open(443)) - Trying to connect to metastore with URI thrift://l4284t.sss.se.com:9083
2017-11-28 14:20:14,873 WARN  [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2017-11-28 14:20:15,005 WARN  [main] security.ShellBasedUnixGroupsMapping (ShellBasedUnixGroupsMapping.java:getUnixGroups(87)) - got exception trying to get groups for user ojoqcu: Incorrect command line arguments.
2017-11-28 14:20:15,042 WARN  [main] hive.metastore (HiveMetaStoreClient.java:open(511)) - set_ugi() not successful, Likely cause: new client talking to old server. Continuing without it.
org.apache.thrift.transport.TTransportException
    at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
    at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
    at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:380)
    at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:230)
    at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:3802)
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:3788)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:503)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:282)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:188)
    at MetaStoreClient.connectHiveMetastore(MetaStoreClient.java:47)
    at MetaStoreClient.main(MetaStoreClient.java:14)
2017-11-28 14:20:15,045 INFO  [main] hive.metastore (HiveMetaStoreClient.java:open(539)) - Connected to metastore.
Metastore client : org.apache.hadoop.hive.metastore.HiveMetaStoreClient@5fdcaa40
Is local metastore ? false
Exception in thread "main" MetaException(message:Got exception: org.apache.thrift.transport.TTransportException null)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.logAndThrowMetaException(MetaStoreUtils.java:1256)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getAllDatabases(HiveMetaStoreClient.java:1129)
    at MetaStoreClient.connectHiveMetastore(MetaStoreClient.java:51)
    at MetaStoreClient.main(MetaStoreClient.java:14)
2017-11-28 14:20:15,057 ERROR [main] hive.log (MetaStoreUtils.java:logAndThrowMetaException(1254)) - Got exception: org.apache.thrift.transport.TTransportException null
org.apache.thrift.transport.TTransportException
    at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
    at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
    at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429)
    at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318)
    at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
    at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_all_databases(ThriftHiveMetastore.java:771)
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_all_databases(ThriftHiveMetastore.java:759)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getAllDatabases(HiveMetaStoreClient.java:1127)
    at MetaStoreClient.connectHiveMetastore(MetaStoreClient.java:51)
    at MetaStoreClient.main(MetaStoreClient.java:14)
2017-11-28 14:20:15,058 ERROR [main] hive.log (MetaStoreUtils.java:logAndThrowMetaException(1255)) - Converting exception to MetaException

Process finished with exit code 1

The metastore log throws:

java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: Invalid status -128
        at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)
        at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:609)
        at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:606)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:360)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1846)
        at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:606)
        at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:269)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.thrift.transport.TTransportException: Invalid status -128
        at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232)
        at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:184)
        at org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
        at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
        at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)

The client configs (which I downloaded from the cluster via Ambari), as well as /etc/hive2/conf/hive-site.xml, /etc/hive/conf/conf.server/hive-site.xml, and hive-metastore/conf/hive-site.xml, all contain the same following properties:

<property>
 <name>hive.metastore.sasl.enabled</name>
 <value>true</value>
</property>
<property>
 <name>hive.server2.thrift.sasl.qop</name>
 <value>auth</value>
</property>

For reference, here are the relevant parts of the client-config hive-site.xml:

<configuration>

    <property>
      <name>ambari.hive.db.schema.name</name>
      <value>hive_metastore</value>
    </property>

    <property>
      <name>atlas.rest.address</name>
      <value>https://l4284t.sss.se.com:21443</value>
    </property>

    <property>
      <name>hive.server2.thrift.http.path</name>
      <value>cliservice</value>
    </property>

    <property>
      <name>hive.server2.thrift.http.port</name>
      <value>10001</value>
    </property>

    <property>
      <name>hive.server2.thrift.max.worker.threads</name>
      <value>500</value>
    </property>

    <property>
      <name>hive.server2.thrift.port</name>
      <value>10000</value>
    </property>

    <property>
      <name>hive.server2.thrift.sasl.qop</name>
      <value>auth</value>
    </property>

    <property>
      <name>hive.server2.transport.mode</name>
      <value>http</value>
    </property>

    <property>
      <name>hive.server2.use.SSL</name>
      <value>false</value>
    </property>

    <property>
      <name>hive.metastore.kerberos.keytab.file</name>
      <value>/etc/security/keytabs/hive.service.keytab</value>
    </property>

    <property>
      <name>hive.metastore.kerberos.principal</name>
      <value>hive/_HOST@GLOBAL.SCD.COM</value>
    </property>

    <property>
      <name>hive.metastore.pre.event.listeners</name>
      <value>org.apache.hadoop.hive.ql.security.authorization.AuthorizationPreEventListener</value>
    </property>

    <property>
      <name>hive.metastore.sasl.enabled</name>
      <value>true</value>
    </property>

    <property>
      <name>hive.metastore.uris</name>
      <value>thrift://l4283t.sss.se.com:9083,thrift://l4284t.sss.se.com:9083</value>
    </property>

    <property>
      <name>hive.security.metastore.authenticator.manager</name>
      <value>org.apache.hadoop.hive.ql.security.HadoopDefaultMetastoreAuthenticator</value>
    </property>

    <property>
      <name>hive.server2.authentication</name>
      <value>KERBEROS</value>
    </property>

    <property>
      <name>hive.server2.authentication.kerberos.keytab</name>
      <value>/etc/security/keytabs/hive.service.keytab</value>
    </property>

    <property>
      <name>hive.server2.authentication.kerberos.principal</name>
      <value>hive/_HOST@GLOBAL.SCD.COM</value>
    </property>

    <property>
      <name>hive.server2.authentication.spnego.keytab</name>
      <value>/etc/security/keytabs/spnego.service.keytab</value>
    </property>

    <property>
      <name>hive.server2.authentication.spnego.principal</name>
      <value>HTTP/_HOST@GLOBAL.SCD.COM</value>
    </property>

    <property>
      <name>hive.server2.enable.doAs</name>
      <value>false</value>
    </property>

    <property>
      <name>hive.server2.keystore.path</name>
      <value>/etc/security/ssl/default-key.jks</value>
    </property>

    <property>
      <name>hive.server2.logging.operation.enabled</name>
      <value>true</value>
    </property>

    <property>
      <name>javax.jdo.option.ConnectionDriverName</name>
      <value>com.mysql.jdbc.Driver</value>
    </property>

    <property>
      <name>javax.jdo.option.ConnectionURL</name>
      <value>jdbc:mysql://l4373t.sss.se.com/hive_metastore</value>
    </property>

    <property>
      <name>javax.jdo.option.ConnectionUserName</name>
      <value>hive</value>
    </property>

  </configuration>

For comparison, connecting to the Hive database via JDBC with the following lines of code works:

System.setProperty("javax.security.auth.useSubjectCredsOnly","false");
System.setProperty("java.security.auth.login.config","E:\\Development\\lib\\hdp\\loginconf.ini");
System.setProperty("java.security.krb5.conf","E:\\Development\\lib\\hdp\\krb5.conf");
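For completeness, a full JDBC connection consistent with the posted hive-site.xml (transportMode http, port 10001, httpPath cliservice, Kerberos principal) would look roughly like the sketch below. It assumes the Hive JDBC driver is on the classpath; the "default" database name is illustrative, not from the original post:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class JdbcKerberosSketch {
    public static void main(String[] args) throws Exception {
        System.setProperty("javax.security.auth.useSubjectCredsOnly", "false");
        System.setProperty("java.security.auth.login.config", "E:\\Development\\lib\\hdp\\loginconf.ini");
        System.setProperty("java.security.krb5.conf", "E:\\Development\\lib\\hdp\\krb5.conf");

        // URL pieces taken from the posted hive-site.xml; the "default" db is illustrative.
        String url = "jdbc:hive2://l4283t.sss.se.com:10001/default;"
                + "principal=hive/_HOST@GLOBAL.SCD.COM;"
                + "transportMode=http;httpPath=cliservice";
        try (Connection con = DriverManager.getConnection(url);
             ResultSet rs = con.createStatement().executeQuery("show databases")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}
```

That this path works while the metastore client fails points at the metastore client's transport setup (no SASL) rather than at Kerberos itself.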