HiveServer2 SSL with Kerberos authentication

Master Guru

I have an HDP-2.3.4.0 Kerberized cluster and I have enabled SSL for HiveServer2 using this documentation link.
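
For reference, the SSL settings I enabled on the HiveServer2 side in hive-site.xml look roughly like the following (the keystore path and password shown here are placeholders standing in for my actual values):

<property>
  <name>hive.server2.use.SSL</name>
  <value>true</value>
</property>
<property>
  <name>hive.server2.keystore.path</name>
  <value>/etc/hive/conf/hive.jks</value>
</property>
<property>
  <name>hive.server2.keystore.password</name>
  <value>password</value>
</property>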

The HiveServer2 daemon is running fine; however, I'm unable to connect to it using Beeline.

I have a valid Kerberos ticket:

[vagrant@ambari-slave1 ~]$ klist
Ticket cache: FILE:/tmp/krb5cc_500
Default principal: vagrant@SUPPORT.COM
Valid starting     Expires            Service principal
04/06/16 11:08:20  04/07/16 11:08:19  krbtgt/SUPPORT.COM@SUPPORT.COM
renew until 04/06/16 11:08:20
[vagrant@ambari-slave1 ~]$ date
Wed Apr  6 13:53:26 UTC 2016
[vagrant@ambari-slave1 ~]$

I tried the commands below; however, none of them works.

Command 1:

!connect jdbc:hive2://ambari-slave1.support.com:10000/;ssl=true;sslTrustStore=/etc/hive/conf/hive.jks;trustStorePassword=password;

Error:

Error: Could not open client transport with JDBC Uri: jdbc:hive2://ambari-slave1.support.com:10000/;ssl=true;sslTrustStore=/etc/hive/conf/hive.jks;trustStorePassword=password;: Peer indicated failure: Unsupported mechanism type PLAIN (state=08S01,code=0)

Error in the HiveServer2 logs:

ERROR [HiveServer2-Handler-Pool: Thread-44]: server.TThreadPoolServer (TThreadPoolServer.java:run(296)) - Error occurred during processing of message.
java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: javax.net.ssl.SSLException: Unrecognized SSL message, plaintext connection?
        at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)
        at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:739)
        at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:736)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:360)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1637)
        at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:736)
        at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:268)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.thrift.transport.TTransportException: javax.net.ssl.SSLException: Unrecognized SSL message, plaintext connection?
        at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:129)
        at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
        at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:178)
        at org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
        at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
        at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
        ... 10 more
Caused by: javax.net.ssl.SSLException: Unrecognized SSL message, plaintext connection?
        at sun.security.ssl.InputRecord.handleUnknownRecord(InputRecord.java:710)
        at sun.security.ssl.InputRecord.read(InputRecord.java:527)
        at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:973)
        at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1375)
        at sun.security.ssl.SSLSocketImpl.readDataRecord(SSLSocketImpl.java:928)
        at sun.security.ssl.AppInputStream.read(AppInputStream.java:105)
        at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
        at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
        at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
        at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:127)
        ... 16 more

Command 2 with error:

beeline> !connect jdbc:hive2://ambari-slave1.support.com:10000/default;ssl=true;sslTrustStore=/etc/hive/conf/hive.jks;trustStorePassword=password;principal=hive/ambari-slave1.support.com@SUPPORT.COM;
Connecting to jdbc:hive2://ambari-slave1.support.com:10000/default;ssl=true;sslTrustStore=/etc/hive/conf/hive.jks;trustStorePassword=password;principal=hive/ambari-slave1.support.com@SUPPORT.COM;
Enter username for jdbc:hive2://ambari-slave1.support.com:10000/default;ssl=true;sslTrustStore=/etc/hive/conf/hive.jks;trustStorePassword=password;principal=hive/ambari-slave1.support.com@SUPPORT.COM;:
Enter password for jdbc:hive2://ambari-slave1.support.com:10000/default;ssl=true;sslTrustStore=/etc/hive/conf/hive.jks;trustStorePassword=password;principal=hive/ambari-slave1.support.com@SUPPORT.COM;:
16/04/06 13:57:05 [main]: WARN transport.TSaslTransport: Could not send failure response
org.apache.thrift.transport.TTransportException: java.net.SocketException: Broken pipe
at org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:161)
at org.apache.thrift.transport.TSaslTransport.sendSaslMessage(TSaslTransport.java:166)
at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:227)
at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:184)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:277)
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:185)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:156)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:208)
at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:142)
at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:207)
at org.apache.hive.beeline.Commands.connect(Commands.java:1149)
at org.apache.hive.beeline.Commands.connect(Commands.java:1070)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:52)
at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:980)
at org.apache.hive.beeline.BeeLine.execute(BeeLine.java:823)
at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:781)
at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:485)
at org.apache.hive.beeline.BeeLine.main(BeeLine.java:468)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.net.SocketException: Broken pipe
at java.net.SocketOutputStream.socketWrite0(Native Method)
at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:109)
at java.net.SocketOutputStream.write(SocketOutputStream.java:153)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
at org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:159)
... 36 more
Error: Could not open client transport with JDBC Uri: jdbc:hive2://ambari-slave1.support.com:10000/default;ssl=true;sslTrustStore=/etc/hive/conf/hive.jks;trustStorePassword=password;principal=hive/ambari-slave1.support.com@SUPPORT.COM;: Invalid status 21
Also, could not send response: org.apache.thrift.transport.TTransportException: java.net.SocketException: Broken pipe (state=08S01,code=0)

Note: the same error appears in the HiveServer2 logs as shown above.

Command 3 (with auth=noSasl):

!connect jdbc:hive2://ambari-slave1.support.com:10000/default;auth=noSasl;ssl=true;sslTrustStore=/etc/hive/conf/hive.jks;trustStorePassword=password;principal=hive/ambari-slave1.support.com@SUPPORT.COM;
Connecting to jdbc:hive2://ambari-slave1.support.com:10000/default;auth=noSasl;ssl=true;sslTrustStore=/etc/hive/conf/hive.jks;trustStorePassword=password;principal=hive/ambari-slave1.support.com@SUPPORT.COM;
Enter username for jdbc:hive2://ambari-slave1.support.com:10000/default;auth=noSasl;ssl=true;sslTrustStore=/etc/hive/conf/hive.jks;trustStorePassword=password;principal=hive/ambari-slave1.support.com@SUPPORT.COM;:
Enter password for jdbc:hive2://ambari-slave1.support.com:10000/default;auth=noSasl;ssl=true;sslTrustStore=/etc/hive/conf/hive.jks;trustStorePassword=password;principal=hive/ambari-slave1.support.com@SUPPORT.COM;:
16/04/06 13:59:54 [main]: ERROR jdbc.HiveConnection: Error opening session
org.apache.thrift.transport.TTransportException
at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:380)
at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:230)
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
at org.apache.hive.service.cli.thrift.TCLIService$Client.recv_OpenSession(TCLIService.java:156)
at org.apache.hive.service.cli.thrift.TCLIService$Client.OpenSession(TCLIService.java:143)
at org.apache.hive.jdbc.HiveConnection.openSession(HiveConnection.java:562)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:171)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:208)
at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:142)
at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:207)
at org.apache.hive.beeline.Commands.connect(Commands.java:1149)
at org.apache.hive.beeline.Commands.connect(Commands.java:1070)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:52)
at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:980)
at org.apache.hive.beeline.BeeLine.execute(BeeLine.java:823)
at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:781)
at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:485)
at org.apache.hive.beeline.BeeLine.main(BeeLine.java:468)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Error: Could not establish connection to jdbc:hive2://ambari-slave1.support.com:10000/default;auth=noSasl;ssl=true;sslTrustStore=/etc/hive/conf/hive.jks;trustStorePassword=password;principal=hive/ambari-slave1.support.com@SUPPORT.COM;: null (state=08S01,code=0)

Note: the same error appears in the HiveServer2 logs as shown above.

When I tried SSL on HiveServer2 without Kerberos on the same setup, it works fine without any issue.

Hive/Security experts - Please help! 🙂

1 ACCEPTED SOLUTION


Don't use SSL (ssl=true) with Kerberos.

SSL and Kerberos are not compatible. Kerberos uses SASL, so hive.server2.thrift.sasl.qop in hive-site.xml has to be set to one of the valid QOP values ('auth', 'auth-int' or 'auth-conf').

https://cwiki.apache.org/confluence/display/Hive/Setting+Up+HiveServer2#SettingUpHiveServer2-Integri...

Then use, for example:

jdbc:hive2://hostname/dbname;sasl.qop=auth-int|auth|auth-conf
jdbc:hive2://sandbox.hortonworks.com:10001/default;principal=hive/sandbox.hortonworks.com@HORTONWORKS.COM?transportMode=http;httpPath=cliservice;auth=kerberos;sasl.qop=auth-int

See

https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.4/bk_Security_Guide/content/ch_wire-connect.h...
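
To make that concrete, here is a minimal sketch, assuming auth-conf is the QOP level you want and reusing the principal from the question (the client-side saslQop value has to match whatever the server is configured with; saslQop is the JDBC parameter name, while the server-side property is hive.server2.thrift.sasl.qop).

In hive-site.xml:

<property>
  <name>hive.server2.thrift.sasl.qop</name>
  <value>auth-conf</value>
</property>

Then from Beeline, without ssl=true in the URL:

!connect jdbc:hive2://ambari-slave1.support.com:10000/default;principal=hive/ambari-slave1.support.com@SUPPORT.COM;saslQop=auth-conf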

4 REPLIES

Master Guru

@Ancil McBarnett - Perfecto!! this works. Thanks a ton 🙂


@Kuldeep Kulkarni @Ancil McBarnett

I have an environment with HDP 2.4.2, Kerberos, and HiveServer2 SSL, and it's working:

beeline -u "jdbc:hive2://fqdn.hostname:10001/;principal=hive/_HOST@DOMAIN;transportMode=http;httpPath=cliservice;ssl=true;sslTrustStore=/certs/truststore_test;trustStorePassword=XXXXXX"

Additionally, it also works through Knox, using HTTPS in the topology URL for Hive:

<service>
  <role>HIVE</role>
  <url>https://{{hive_server_host}}:{{hive_http_port}}/{{hive_http_path}}</url>
</service>
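
For example, the Beeline connection through the Knox gateway looks something like this on my side (the gateway hostname, port 8443, and the "default" topology name in the httpPath are assumptions from a typical Knox setup and will differ in your environment):

beeline -u "jdbc:hive2://knoxhost.example.com:8443/;ssl=true;sslTrustStore=/certs/truststore_test;trustStorePassword=XXXXXX;transportMode=http;httpPath=gateway/default/hive"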

New Contributor

Hello Team,

I am facing the same problem: SSL with Kerberos throws a broken pipe error. Below is the message from Beeline.

[hive@ivlhdp376 ~]$ beeline
WARNING: Use "yarn jar" to launch YARN applications.
Beeline version 1.2.1000.2.4.2.0-258 by Apache Hive
beeline> !connect jdbc:hive2://ivlhdp376.informatica.com:10000/default;principal=hive/ivlhdp376.informatica.com@INFAKRB.INFADEV.COM;ssl=true;sslTrustStore=/etc/hive/conf/truststore.jks;trustStorePassword=password;saslQop=auth-conf
Connecting to jdbc:hive2://ivlhdp376.informatica.com:10000/default;principal=hive/ivlhdp376.informatica.com@INFAKRB.INFADEV.COM;ssl=true;sslTrustStore=/etc/hive/conf/truststore.jks;trustStorePassword=password;saslQop=auth-conf
Enter username for jdbc:hive2://ivlhdp376.informatica.com:10000/default;principal=hive/ivlhdp376.informatica.com@INFAKRB.INFADEV.COM;ssl=true;sslTrustStore=/etc/hive/conf/truststore.jks;trustStorePassword=password;saslQop=auth-conf: rangerhive/ivlhdp376.informatica.com@INFAKRB.INFADEV.COM
Enter password for jdbc:hive2://ivlhdp376.informatica.com:10000/default;principal=hive/ivlhdp376.informatica.com@INFAKRB.INFADEV.COM;ssl=true;sslTrustStore=/etc/hive/conf/truststore.jks;trustStorePassword=password;saslQop=auth-conf: **********
Error: Could not open client transport with JDBC Uri: jdbc:hive2://ivlhdp376.informatica.com:10000/default;principal=hive/ivlhdp376.informatica.com@INFAKRB.INFADEV.COM;ssl=true;sslTrustStore=/etc/hive/conf/truststore.jks;trustStorePassword=password;saslQop=auth-conf: Invalid status 21 (state=08S01,code=0)


Hello Team,

I am facing trouble with the same issue, using HDP 2.4.2 with Ambari 2.2. Kerberos is enabled and running, and SSL is working fine for HDFS, YARN, and MapReduce.

But it's not working with HiveServer2. I'm getting the below message in Beeline.

beeline> !connect jdbc:hive2://ivlhdp376.informatica.com:10000/default;principal=hive/xxx@yyyy;ssl=true;sslTrustStore=/etc/hive/conf/truststore.jks;trustStorePassword=password;saslQop=auth-conf
Connecting to jdbc:hive2://xxx:10000/default;principal=hive/xxx@yyyy;ssl=true;sslTrustStore=/etc/hive/conf/truststore.jks;trustStorePassword=password;saslQop=auth-conf
Enter username for jdbc:hive2://xxx:10000/default;principal=hive/xxx@yyyy;ssl=true;sslTrustStore=/etc/hive/conf/truststore.jks;trustStorePassword=password;saslQop=auth-conf: rangerhive/xxx@yyyy
Enter password for jdbc:hive2://xxx:10000/default;principal=hive/xxx@yyyy;ssl=true;sslTrustStore=/etc/hive/conf/truststore.jks;trustStorePassword=password;saslQop=auth-conf: **********
Error: Could not open client transport with JDBC Uri: jdbc:hive2://xxx:10000/default;principal=hive/xxx@yyyy;ssl=true;sslTrustStore=/etc/hive/conf/truststore.jks;trustStorePassword=password;saslQop=auth-conf: Invalid status 21 (state=08S01,code=0)
0: jdbc:hive2://ivlhdp376.informatica.com:100 (closed)>

The server log gives a broken pipe error.
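
In case it helps to narrow things down, one generic check (just a plain OpenSSL client probe, nothing Hive-specific) to see whether port 10000 is actually negotiating TLS:

openssl s_client -connect ivlhdp376.informatica.com:10000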

Please help me fix it.