
beeline and kerberos

Rising Star

I am trying to use beeline with Hive + Kerberos (Hortonworks Sandbox 2.3).

The problem is that I can use HDFS but not beeline, and I do not know what is wrong.

Console output:

[margusja@sandbox ~]$ kdestroy

[margusja@sandbox ~]$ hdfs dfs -ls /user/

16/01/09 15:45:32 WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]

ls: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "sandbox.hortonworks.com/10.0.2.15"; destination host is: "sandbox.hortonworks.com":8020;

[margusja@sandbox ~]$ kinit margusja

Password for margusja@EXAMPLE.COM:

[margusja@sandbox ~]$ hdfs dfs -ls /user/

Found 11 items

drwxrwx--- - ambari-qa hdfs 0 2015-10-27 12:39 /user/ambari-qa

drwxr-xr-x - guest guest 0 2015-10-27 12:55 /user/guest

drwxr-xr-x - hcat hdfs 0 2015-10-27 12:43 /user/hcat

drwx------ - hdfs hdfs 0 2015-10-27 13:22 /user/hdfs

drwx------ - hive hdfs 0 2016-01-08 19:44 /user/hive

drwxrwxrwx - hue hdfs 0 2015-10-27 12:55 /user/hue

drwxrwxr-x - oozie hdfs 0 2015-10-27 12:44 /user/oozie

drwxr-xr-x - solr hdfs 0 2015-10-27 12:48 /user/solr

drwxrwxr-x - spark hdfs 0 2015-10-27 12:41 /user/spark

drwxr-xr-x - unit hdfs 0 2015-10-27 12:46 /user/unit

So I think margusja's credentials are OK.

[margusja@sandbox ~]$ klist -f
Ticket cache: FILE:/tmp/krb5cc_1024
Default principal: margusja@EXAMPLE.COM
Valid starting     Expires            Service principal
01/10/16 07:54:34  01/11/16 07:54:34  krbtgt/EXAMPLE.COM@EXAMPLE.COM
renew until 01/17/16 07:54:34, Flags: FRI

Now I try to use beeline:

[margusja@sandbox ~]$ beeline -u "jdbc:hive2://127.0.0.1:10000/default;principal=hive/sandbox.hortonworks.com@EXAMPLE.COM"

SLF4J: Class path contains multiple SLF4J bindings.

SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/spark/lib/spark-assembly-1.4.1.2.3.2.0-2950-hadoop2.7.1.2.3.2.0-2950.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

WARNING: Use "yarn jar" to launch YARN applications.

SLF4J: Class path contains multiple SLF4J bindings.

SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/spark/lib/spark-assembly-1.4.1.2.3.2.0-2950-hadoop2.7.1.2.3.2.0-2950.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

Connecting to jdbc:hive2://127.0.0.1:10000/default;principal=hive/sandbox.hortonworks.com@EXAMPLE.COM

16/01/09 15:46:59 [main]: ERROR transport.TSaslTransport: SASL negotiation failure

javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]

at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)

at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)

at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)

at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)

at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)

at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)

at java.security.AccessController.doPrivileged(Native Method)

at javax.security.auth.Subject.doAs(Subject.java:415)

at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)

at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)

at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:210)

at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:180)

at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)

at java.sql.DriverManager.getConnection(DriverManager.java:571)

at java.sql.DriverManager.getConnection(DriverManager.java:187)

at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:142)

at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:207)

at org.apache.hive.beeline.Commands.connect(Commands.java:1149)

at org.apache.hive.beeline.Commands.connect(Commands.java:1070)

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:606)

at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:52)

at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:970)

at org.apache.hive.beeline.BeeLine.initArgs(BeeLine.java:707)

at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:757)

at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:484)

at org.apache.hive.beeline.BeeLine.main(BeeLine.java:467)

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:606)

at org.apache.hadoop.util.RunJar.run(RunJar.java:221)

at org.apache.hadoop.util.RunJar.main(RunJar.java:136)

Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)

at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)

at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)

at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)

at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)

at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)

at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)

at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)

... 34 more

Error: Could not open client transport with JDBC Uri: jdbc:hive2://127.0.0.1:10000/default;principal=hive/sandbox.hortonworks.com@EXAMPLE.COM: GSS initiate failed (state=08S01,code=0)

Beeline version 1.2.1.2.3.2.0-2950 by Apache Hive

0: jdbc:hive2://127.0.0.1:10000/default (closed)>

Hive is configured as the documentation requires:

<property>
  <name>hive.server2.authentication</name>
  <value>KERBEROS</value>
</property>

<property>
  <name>hive.server2.authentication.kerberos.keytab</name>
  <value>/etc/security/keytabs/hive.service.keytab</value>
</property>

<property>
  <name>hive.server2.authentication.kerberos.principal</name>
  <value>hive/_HOST@EXAMPLE.COM</value>
</property>

One more note:

When I do:

[margusja@sandbox ~]$ hdfs dfs -ls /

I see in krb5kdc log:

Jan 09 21:36:53 sandbox.hortonworks.com krb5kdc[8565](info): TGS_REQ (6 etypes {18 17 16 23 1 3}) 10.0.2.15: ISSUE: authtime 1452375310, etypes {rep=18 tkt=18 ses=18}, margusja@EXAMPLE.COM for nn/sandbox.hortonworks.com@EXAMPLE.COM

but when I use beeline, no lines appear in the krb5kdc log.

When I do

[margusja@sandbox ~]$ kdestroy

and then run hdfs dfs -ls /, there are also no lines in the krb5kdc log.

I am so confused. What is beeline expecting? I run kinit and get a ticket before using beeline.

Any hints would be welcome, because I am out of ideas.

1 ACCEPTED SOLUTION

Rising Star

I do not know if this is the solution, but one helpful thing is to enable Kerberos debug mode to see what Kerberos wants:

export HADOOP_OPTS="-Dsun.security.krb5.debug=true"

It helped me.


19 REPLIES

@Margus Roo

ERROR transport.TSaslTransport: SASL negotiation failure

javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]

kinit using the hive keytab and see if you can log in.

@Margus Roo Thanks for trying that.

Try this

Run beeline, then press Enter and type:

!connect jdbc:hive2://localhost:10000/;principal=hive/sandbox.hortonworks.com@EXAMPLE.COM

@Margus Roo Also, are you able to log in using the Hive CLI?

Rising Star

Hi

[root@sandbox ~]# kinit -kt /etc/security/keytabs/hive.service.keytab hive/sandbox.hortonworks.com@EXAMPLE.COM
[root@sandbox ~]# klist
Ticket cache: FILE:/tmp/krb5cc_0
Default principal: hive/sandbox.hortonworks.com@EXAMPLE.COM
Valid starting     Expires            Service principal
01/10/16 12:21:27  01/11/16 12:21:27  krbtgt/EXAMPLE.COM@EXAMPLE.COM
renew until 01/17/16 12:21:27
Is it OK so far? Do I have a valid ticket?
[root@sandbox ~]# beeline -u "jdbc:hive2://localhost:10000/;principal=hive/sandbox.hortonworks.com@EXAMPLE.COM" 
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/spark/lib/spark-assembly-1.4.1.2.3.2.0-2950-hadoop2.7.1.2.3.2.0-2950.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
WARNING: Use "yarn jar" to launch YARN applications.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/spark/lib/spark-assembly-1.4.1.2.3.2.0-2950-hadoop2.7.1.2.3.2.0-2950.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Connecting to jdbc:hive2://localhost:10000/;principal=hive/sandbox.hortonworks.com@EXAMPLE.COM
16/01/10 12:23:42 [main]: ERROR transport.TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:210)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:180)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
at java.sql.DriverManager.getConnection(DriverManager.java:571)
at java.sql.DriverManager.getConnection(DriverManager.java:187)
at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:142)
at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:207)
at org.apache.hive.beeline.Commands.connect(Commands.java:1149)
at org.apache.hive.beeline.Commands.connect(Commands.java:1070)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:52)
at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:970)
at org.apache.hive.beeline.BeeLine.initArgs(BeeLine.java:707)
at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:757)
at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:484)
at org.apache.hive.beeline.BeeLine.main(BeeLine.java:467)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
... 34 more
Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000/;principal=hive/sandbox.hortonworks.com@EXAMPLE.COM: GSS initiate failed (state=08S01,code=0)
Beeline version 1.2.1.2.3.2.0-2950 by Apache Hive
0: jdbc:hive2://localhost:10000/ (closed)>
:(

Rising Star

Hi, and thanks for the dialog.

I can log in using the hive command.

And I can see from /var/log/krb5kdc.log that there is communication. With beeline there is silence.

beeline still gives: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]

I cannot quite understand that message. It says there is no TGT.

I even made a new user for testing, margusja, because some documentation recommended using a separate user rather than hive.

[margusja@sandbox ~]$ klist -f -e
Ticket cache: FILE:/tmp/krb5cc_1024
Default principal: margusja@EXAMPLE.COM
Valid starting     Expires            Service principal
01/10/16 16:05:29  01/11/16 16:05:29  krbtgt/EXAMPLE.COM@EXAMPLE.COM
renew until 01/17/16 16:05:29, Flags: FRI
Etype (skey, tkt): arcfour-hmac, aes256-cts-hmac-sha1-96

Does the above mean that I have a TGT?

How does beeline check the TGT?
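One way to check how the TGT is found: beeline runs in a JVM, which reads the file credential cache named by KRB5CCNAME (typically falling back to /tmp/krb5cc_&lt;uid&gt; on Linux). If klist and beeline end up looking at different caches, beeline reports no TGT even though klist shows one. A minimal sketch of that comparison, assuming an MIT Kerberos client:

```shell
# Compare the cache klist reads with the one the JVM will read.
klist | head -1              # e.g. "Ticket cache: FILE:/tmp/krb5cc_1024"
echo "${KRB5CCNAME:-unset}"  # unset, or the same FILE:/tmp/krb5cc_<uid> path, is fine
id -u                        # the <uid> used in the default cache name
```

If KRB5CCNAME points at a non-FILE cache type or a different path, kinit and beeline can disagree even on the same host.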

Any help is welcome.

Br, Margusja

Guru

Hi @Margus Roo ,

does the hive user on the Hiveserver node have a valid Kerberos ticket as well ?

Try to re-init one for user 'hive'.

I had a similar issue in certain versions, where the ticket for user 'hive' hadn't been updated automatically.

Rising Star

Tried to re-init:

[margusja@sandbox ~]$ klist -e -f
Ticket cache: FILE:/tmp/krb5cc_1024
Default principal: margusja@EXAMPLE.COM
Valid starting     Expires            Service principal
01/10/16 16:21:10  01/11/16 16:21:10  krbtgt/EXAMPLE.COM@EXAMPLE.COM
renew until 01/17/16 16:21:10, Flags: FRI
Etype (skey, tkt): arcfour-hmac, aes256-cts-hmac-sha1-96
[margusja@sandbox ~]$

And I can re-init:

[margusja@sandbox ~]$ klist -e -f
Ticket cache: FILE:/tmp/krb5cc_1024
Default principal: margusja@EXAMPLE.COM
Valid starting     Expires            Service principal
01/10/16 16:34:54  01/11/16 16:34:54  krbtgt/EXAMPLE.COM@EXAMPLE.COM
renew until 01/17/16 16:21:10, Flags: FRIT
Etype (skey, tkt): arcfour-hmac, aes256-cts-hmac-sha1-96
[margusja@sandbox ~]$

Unfortunately I had no success:

beeline> !connect jdbc:hive2://127.0.0.1:10000/default;principal=hive/sandbox.hortonworks.com@EXAMPLE.COM
Connecting to jdbc:hive2://127.0.0.1:10000/default;principal=hive/sandbox.hortonworks.com@EXAMPLE.COM
Enter username for jdbc:hive2://127.0.0.1:10000/default;principal=hive/sandbox.hortonworks.com@EXAMPLE.COM:
Enter password for jdbc:hive2://127.0.0.1:10000/default;principal=hive/sandbox.hortonworks.com@EXAMPLE.COM:
16/01/10 16:35:36 [main]: ERROR transport.TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]

I cannot understand what is missing for beeline: "Failed to find any Kerberos tgt".

What is beeline searching for? I have a TGT in the cache, as you can see above.

Br, Margusja

Guru

Hi @Margus Roo , in my previous answer I meant to check the kerberos ticket for user 'hive', not for your personal user.

sudo su - hive
kdestroy
kinit -kt <path-to-keytab> hive/sandbox.hortonworks.com
klist

and then again the beeline command...

Can you try the FQDN or hostname/IP instead of localhost or 127.0.0.1 in the beeline connect string?

Rising Star

Changed hostname in connection string. Used FQDN:

beeline> !connect jdbc:hive2://sandbox.hortonworks.com:10000/default;principal=hive/sandbox.hortonworks.com@EXAMPLE.COM
Connecting to jdbc:hive2://sandbox.hortonworks.com:10000/default;principal=hive/sandbox.hortonworks.com@EXAMPLE.COM
Enter username for jdbc:hive2://sandbox.hortonworks.com:10000/default;principal=hive/sandbox.hortonworks.com@EXAMPLE.COM:
Enter password for jdbc:hive2://sandbox.hortonworks.com:10000/default;principal=hive/sandbox.hortonworks.com@EXAMPLE.COM:
16/01/10 16:55:49 [main]: ERROR transport.TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]

I cannot understand it. The HiveServer2 log does not help; it shows the same error as the beeline output.

I can re-init.

Is there any method to validate my TGT?

It works for HDFS:

[margusja@sandbox ~]$ hdfs dfs -ls /
Found 9 items

Contributor

A few things to double check:

1. Is there any message on the Hiveserver2 that correlates to the Beeline error?

2. Can you double-check that you have the Unlimited Strength JCE Policy installed correctly and that alternatives is pointing at the copy of Java that has this policy? (I'm noticing that your Kerberos ticket uses AES-256.)

3. Set the system property sun.security.krb5.debug to true on your JVM and see if you can get any details from the debug logs.
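For point 2, a quick way to check whether the unlimited-strength policy is active is to ask the JVM for its maximum allowed AES key length. This sketch uses jrunscript, which ships with JDK 7/8 of that era; the JAVA_HOME path is an assumption and should point at the JVM beeline actually uses:

```shell
# Prints 2147483647 when unlimited-strength crypto is enabled;
# 128 means the default (limited) policy is still in place.
"$JAVA_HOME/bin/jrunscript" -e 'print(javax.crypto.Cipher.getMaxAllowedKeyLength("AES"))'
```

A limited policy with an AES-256 ticket produces exactly this kind of "no valid credentials" failure, so this check is worth doing early.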

Explorer

You can use beeline to connect from an edge-node server to hiveserver2. Below is an example:

beeline -u "jdbc:hive2://127.0.0.1:10000/default;principal=hive/sandbox.hortonworks.com@EXAMPLE.COM;auth=kerberos" -n <user>

The key part of this example is the JDBC URL that has to be provided for Kerberos authentication to work correctly. Note the three main sections of the JDBC URL: jdbc:hive2://127.0.0.1:10000/default; principal=hive/sandbox.hortonworks.com@EXAMPLE.COM; auth=kerberos

The first part is a standard JDBC URL that provides information about the driver (hive2), the hostname (127.0.0.1), the port number (10000), and the default database (default).

The second part is special to Kerberos. It tells you what service principal is used to authenticate to this URL.

And the final part tells JDBC that you definitely want to do Kerberos authentication (auth=kerberos).

You'll also note that the command line for beeline included a specific username (-n <user>). This is required so that beeline knows which Kerberos TGT to look for.

All of this assumes that when you log in to the edge node server, you followed standard protocol to get a Kerberos TGT. (The profile is set up so that you're automatically prompted for your password; this establishes your TGT.)
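To make the three sections concrete, the connect string can be assembled piecewise. The host, realm, and user below are the sandbox values from this thread; substitute your own:

```shell
HOST=sandbox.hortonworks.com
DB=default
PRINCIPAL="hive/${HOST}@EXAMPLE.COM"   # HiveServer2's principal, not your own user's
URL="jdbc:hive2://${HOST}:10000/${DB};principal=${PRINCIPAL};auth=kerberos"
echo "$URL"
# beeline -u "$URL" -n margusja        # run on the edge node after kinit
```

Building the URL in pieces makes it harder to drop a section or mistype the `;auth=kerberos` suffix.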

Mentor

@Margus Roo are you still having issues with this? Can you accept best answer or provide your own solution?

Rising Star

I do not know if this is the solution, but one helpful thing is to enable Kerberos debug mode to see what Kerberos wants:

export HADOOP_OPTS="-Dsun.security.krb5.debug=true"

It helped me.
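For completeness, a sketch of how the debug flag is typically used in one session. The kdestroy/kinit steps are an assumption about starting from a clean ticket, not something mandated by the flag:

```shell
export HADOOP_OPTS="-Dsun.security.krb5.debug=true"
echo "$HADOOP_OPTS"
# kdestroy && kinit margusja   # start from a fresh ticket
# beeline -u "jdbc:hive2://sandbox.hortonworks.com:10000/default;principal=hive/sandbox.hortonworks.com@EXAMPLE.COM"
# The JGSS/Krb5 debug output then shows which ccache file and etypes the JVM tried.
```

The debug lines about the credential cache path and encryption types are usually enough to see why the JVM cannot find a usable TGT.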

New Contributor

Hi Margus, I am facing a similar issue, and setting the debug flag is not helping me much. I tried all the various ways of logging in to beeline, with and without hive service tickets and also with different TGTs.

Following are some of my observations:

I can log in to the Hive CLI successfully.

The Ambari hive service check passed.

With a valid hive (or other) TGT, I am able to list the HDFS directories (hadoop fs -ls /).

Do you see any other check I might need to do here? Was your issue along similar lines? Would you mind sharing the fix you made for the problem?

Thanks,

Bala

Explorer

Did you resolve the problem?

I have the same issue in Sandbox 2.4.

New Contributor

The issue with beeline access to Hive when using Kerberos is that we need to use the right principal in the connection string, and it MUST be hive's principal.

1. Explicitly do a kinit and grab a valid ticket from Kerberos.

2. After you have a valid ticket, use the following URL to connect using beeline: beeline -u "jdbc:hive2://<hive_server_name>:10000/<db_name>;principal=hive/<hostname>@<realm_name>"

This will do the trick.
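As a sketch of those two steps together, using the sandbox host and user from this thread (a password kinit is shown; a keytab kinit works too):

```shell
# 1. Get a TGT for your own user.
kinit margusja@EXAMPLE.COM
klist    # confirm a krbtgt/EXAMPLE.COM@EXAMPLE.COM entry exists

# 2. Connect, naming HiveServer2's principal (not your own) in the URL.
beeline -u "jdbc:hive2://sandbox.hortonworks.com:10000/default;principal=hive/sandbox.hortonworks.com@EXAMPLE.COM"
```

The kinit establishes who you are; the principal= parameter only tells the client which service identity to expect on the other end.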

Super Collaborator

BEWARE !!

In the connection string below:

beeline -u "jdbc:hive2://rjk-hdp25-s-01:2181,rjk-hdp25-s-02:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;principal=hive/rjk-hdp25-m-02@FIELD.HORTONWORKS.COM"

the principal hive/rjk-hdp25-m-02@FIELD.HORTONWORKS.COM is actually the one running the HiveServer2

AND NOT

the user that you just kinit'ed with before you fired the beeline command!

Then it works. It is very, very confusing, so beware!

To survive a HiveServer2 HA failover, the syntax:

hive/_HOST@FIELD.HORTONWORKS.COM

also works

Explorer

I solved this problem after adding this property to core-site.xml:

<configuration>
    <property>
      <name>hadoop.security.authentication</name>
      <value>kerberos</value>
    </property>
</configuration>
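A quick way to confirm the value the client actually picked up, using the standard hdfs getconf command:

```shell
hdfs getconf -confKey hadoop.security.authentication
# Expect "kerberos"; "simple" means the client-side config was not updated.
```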