Access Hive through Knox error

Master Collaborator

I am following the security labs and have reached Lab 8 (Knox).

https://github.com/HortonworksUniversity/Security_Labs#lab-8

The WebHDFS steps all worked fine, but I am getting errors in the Hive step.

When I used openssl to import the self-signed cert, I got an error; I'm not sure if that was the cause:

knoxserver=$(hostname -f)
openssl s_client -connect ${knoxserver}:8443 <<<'' | openssl x509 -out /tmp/knox.crt


depth=0 C = US, ST = Test, L = Test, O = Hadoop, OU = Test, CN = qwang-hdp5.field.hortonworks.com
verify error:num=18:self signed certificate
verify return:1
depth=0 C = US, ST = Test, L = Test, O = Hadoop, OU = Test, CN = qwang-hdp5.field.hortonworks.com
verify return:1
DONE
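Side note: from what I've read, verify error:num=18 just means the certificate is self-signed, which is expected in this lab. A quick way to sanity-check the exported file (illustrative, not part of the lab steps):

openssl x509 -in /tmp/knox.crt -noout -subject -issuer -dates

It should print the same CN as above with valid dates.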

On the beeline node, the cert imported fine with the following command:

keytool -import -trustcacerts -keystore /etc/pki/java/cacerts -storepass changeit -noprompt -alias knox -file /tmp/knox.crt

But when I use the following command to access Hive, I get an error that looks like the user was not granted access in Ranger, even though the user is already included in the "all - topology, service" policy:

beeline -u "jdbc:hive2://knoxnode:8443/;ssl=true;transportMode=http;httpPath=gateway/default/hive" -n hadoopadmin -p password

Error from beeline

17/01/05 15:22:19 [main]: ERROR jdbc.HiveConnection: Error opening session
org.apache.thrift.transport.TTransportException: javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
	at org.apache.thrift.transport.THttpClient.flushUsingHttpClient(THttpClient.java:297)
	at org.apache.thrift.transport.THttpClient.flush(THttpClient.java:313)
	at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:73)
	at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:62)
	at org.apache.hive.service.cli.thrift.TCLIService$Client.send_OpenSession(TCLIService.java:154)
	at org.apache.hive.service.cli.thrift.TCLIService$Client.OpenSession(TCLIService.java:146)
	at org.apache.hive.jdbc.HiveConnection.openSession(HiveConnection.java:552)
	at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:170)
	at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
	at java.sql.DriverManager.getConnection(DriverManager.java:664)
	at java.sql.DriverManager.getConnection(DriverManager.java:208)
	at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:146)
	at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:211)
	at org.apache.hive.beeline.Commands.connect(Commands.java:1190)
	at org.apache.hive.beeline.Commands.connect(Commands.java:1086)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:52)
	at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:990)
	at org.apache.hive.beeline.BeeLine.initArgs(BeeLine.java:715)
	at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:777)
	at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:491)
	at org.apache.hive.beeline.BeeLine.main(BeeLine.java:474)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.util.RunJar.run(RunJar.java:233)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
Caused by: javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
	at sun.security.ssl.Alerts.getSSLException(Alerts.java:192)
	at sun.security.ssl.SSLSocketImpl.fatal(SSLSocketImpl.java:1949)
	at sun.security.ssl.Handshaker.fatalSE(Handshaker.java:302)
	at sun.security.ssl.Handshaker.fatalSE(Handshaker.java:296)
	at sun.security.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1509)
	at sun.security.ssl.ClientHandshaker.processMessage(ClientHandshaker.java:216)
	at sun.security.ssl.Handshaker.processLoop(Handshaker.java:979)
	at sun.security.ssl.Handshaker.process_record(Handshaker.java:914)
	at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:1062)
	at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1375)
	at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1403)
	at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1387)
	at org.apache.http.conn.ssl.SSLConnectionSocketFactory.createLayeredSocket(SSLConnectionSocketFactory.java:395)
	at org.apache.http.conn.ssl.SSLConnectionSocketFactory.connectSocket(SSLConnectionSocketFactory.java:354)
	at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:134)
	at org.apache.http.impl.conn.BasicHttpClientConnectionManager.connect(BasicHttpClientConnectionManager.java:338)
	at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:380)
	at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:236)
	at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:184)
	at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:88)
	at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
	at org.apache.http.impl.execchain.ServiceUnavailableRetryExec.execute(ServiceUnavailableRetryExec.java:84)
	at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:184)
	at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:117)
	at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:55)
	at org.apache.thrift.transport.THttpClient.flushUsingHttpClient(THttpClient.java:251)
	... 30 more
Caused by: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
	at sun.security.validator.PKIXValidator.doBuild(PKIXValidator.java:387)
	at sun.security.validator.PKIXValidator.engineValidate(PKIXValidator.java:292)
	at sun.security.validator.Validator.validate(Validator.java:260)
	at sun.security.ssl.X509TrustManagerImpl.validate(X509TrustManagerImpl.java:324)
	at sun.security.ssl.X509TrustManagerImpl.checkTrusted(X509TrustManagerImpl.java:229)
	at sun.security.ssl.X509TrustManagerImpl.checkServerTrusted(X509TrustManagerImpl.java:124)
	at sun.security.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1491)
	... 51 more
Caused by: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
	at sun.security.provider.certpath.SunCertPathBuilder.build(SunCertPathBuilder.java:141)
	at sun.security.provider.certpath.SunCertPathBuilder.engineBuild(SunCertPathBuilder.java:126)
	at java.security.cert.CertPathBuilder.build(CertPathBuilder.java:280)
	at sun.security.validator.PKIXValidator.doBuild(PKIXValidator.java:382)
	... 57 more
Error: Could not establish connection to jdbc:hive2://qwang-hdp5.field.hortonworks.com:8443/;ssl=true;transportMode=http;httpPath=gateway/default/hive: javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target (state=08S01,code=0)

Looking at the Knox log, it complains about an error writing to a log file:

2017-01-05 03:01:24,993 ERROR provider.BaseAuditHandler (BaseAuditHandler.java:logError(329)) - Error writing to log file.
org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled.  Available:[TOKEN, KERBEROS]
	at sun.reflect.GeneratedConstructorAccessor32.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2160)
	at org.apache.hadoop.hdfs.DistributedFileSystem$25.doCall(DistributedFileSystem.java:1423)
	at org.apache.hadoop.hdfs.DistributedFileSystem$25.doCall(DistributedFileSystem.java:1419)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1419)
	at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1443)
	at org.apache.ranger.audit.destination.HDFSAuditDestination.getLogFileStream(HDFSAuditDestination.java:273)
	at org.apache.ranger.audit.destination.HDFSAuditDestination.access$000(HDFSAuditDestination.java:44)
	at org.apache.ranger.audit.destination.HDFSAuditDestination$1.run(HDFSAuditDestination.java:159)
	at org.apache.ranger.audit.destination.HDFSAuditDestination$1.run(HDFSAuditDestination.java:156)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
	at org.apache.ranger.audit.destination.HDFSAuditDestination.logJSON(HDFSAuditDestination.java:170)
	at org.apache.ranger.audit.queue.AuditFileSpool.sendEvent(AuditFileSpool.java:880)
	at org.apache.ranger.audit.queue.AuditFileSpool.runLogAudit(AuditFileSpool.java:828)
	at org.apache.ranger.audit.queue.AuditFileSpool.run(AuditFileSpool.java:758)
	at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): SIMPLE authentication is not enabled.  Available:[TOKEN, KERBEROS]
	at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1552)
	at org.apache.hadoop.ipc.Client.call(Client.java:1496)
	at org.apache.hadoop.ipc.Client.call(Client.java:1396)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
	at com.sun.proxy.$Proxy48.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:816)
	at sun.reflect.GeneratedMethodAccessor50.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:278)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:194)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:176)
	at com.sun.proxy.$Proxy49.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2158)
	... 17 more
2017-01-05 03:01:24,995 ERROR queue.AuditFileSpool (AuditFileSpool.java:logError(710)) - Error sending logs to consumer. provider=knox.async.multi_dest.batch, consumer=knox.async.multi_dest.batch.hdfs

I can't make sense of these errors. What do they really mean?


10 REPLIES


@Qi Wang

The beeline connection is using simple authentication with a username and password, but the server is configured with a different authentication mechanism. Perhaps you need to authenticate with Kerberos instead?

sudo su - sales1
kdestroy
beeline -u "jdbc:hive2://localhost:10000/default;principal=hive/$(hostname -f)@LAB.HORTONWORKS.NET"

Master Collaborator

The intention is to access Hive through Knox, not directly.

Guru

Hello @Qi Wang,

1. You have not shown what error you are getting in beeline. Please paste that error.

2. The error in the Knox log is due to the fact that Ranger audit is not able to access the kerberized HDFS service. This is only an audit error and should not block beeline access.

3. IMO, the actual error (if any) would be in the HiveServer2 log file (/var/log/hive/hiveserver2.log). Please check that for clues.

To fix the Ranger audit error, you can either check the ranger-knox-audit configuration and enable DEBUG logging for Knox, or turn off 'Ranger audit to HDFS' (see the sketch below).
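For the latter option, the change would look something like this (property name taken from the standard Ranger plugin audit configs; verify it against your stack version):

# Ambari -> Knox -> Configs -> Advanced ranger-knox-audit
xasecure.audit.destination.hdfs=false

Restart Knox afterwards for the change to take effect.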

Hope this helps.

Master Collaborator

@Vipin Rathor

I just included the beeline error in the original message. I tailed /var/log/hive/hiveserver2.log and did not see any errors there when the beeline command failed.

From the beeline error, it seems to be SSL-related, which is why I included the commands I used to import the cert on both nodes.


That stack trace error in beeline seems clear to me:

org.apache.thrift.transport.TTransportException: javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target

To fix this, you need to know which Java installation beeline is using. Run ps -ef | grep beeline to see, like so:

[root@chupa1 ~]# ps -ef | grep beeline

root      4239  4217  2 16:20 pts/0    00:00:01 /usr/jdk64/jdk1.8.0_77/bin/java -Xmx1024m -Dhdp.version=2.5.0.0-1133 -Djava.net.preferIPv4Stack=true -Dhdp.version=2.5.0.0-1133 -Dhadoop.log.dir=/var/log/hadoop/root -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/hdp/2.5.0.0-1133/hadoop -Dhadoop.id.str=root -Dhadoop.root.logger=INFO,console -Djava.library.path=:/usr/hdp/2.5.0.0-1133/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.0.0-1133/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Xmx1024m -Xmx1024m -Djava.util.logging.config.file=/usr/hdp/2.5.0.0-1133/hive/conf/parquet-logging.properties -Dlog4j.configuration=beeline-log4j.properties -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /usr/hdp/2.5.0.0-1133/hive/lib/hive-beeline-1.2.1000.2.5.0.0-1133.jar org.apache.hive.beeline.BeeLine

Based on my output, I would import the Knox certificate into the cacerts truststore that my beeline client's JVM is using, which in my case is

/usr/jdk64/jdk1.8.0_77/jre/lib/security/cacerts

The import would now look like:

keytool -import -trustcacerts -keystore /usr/jdk64/jdk1.8.0_77/jre/lib/security/cacerts -storepass changeit -noprompt -alias knox -file /tmp/knox.crt
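An optional sanity check, to confirm the alias actually landed in that truststore (assuming the stock changeit password):

keytool -list -keystore /usr/jdk64/jdk1.8.0_77/jre/lib/security/cacerts -storepass changeit -alias knox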

Then restart the beeline client to move past the error.

The issue here is definitely with SSL.

Master Collaborator

@dvillarreal

I followed your steps; the Java version is the same one you mentioned:

[root@qwang-hdp2 ~]# ps -ef | grep beeline
root     32310 26098 39 16:58 pts/0    00:00:05 /usr/jdk64/jdk1.8.0_77/bin/java -Xmx1024m -Dhdp.version=2.5.3.0-37 -Djava.net.preferIPv4Stack=true -Dhdp.version=2.5.3.0-37 -Dhadoop.log.dir=/var/log/hadoop/root -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/hdp/2.5.3.0-37/hadoop -Dhadoop.id.str=root -Dhadoop.root.logger=INFO,console -Djava.library.path=:/usr/hdp/2.5.3.0-37/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.3.0-37/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Xmx1024m -Xmx1024m -Djava.util.logging.config.file=/usr/hdp/2.5.3.0-37/hive/conf/parquet-logging.properties -Dlog4j.configuration=beeline-log4j.properties -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /usr/hdp/2.5.3.0-37/hive/lib/hive-beeline-1.2.1000.2.5.3.0-37.jar org.apache.hive.beeline.BeeLine -u jdbc:hive2://qwang-hdp5.field.hortonworks.com:8443/;ssl=true;transportMode=http;httpPath=gateway/default/hive -n hadoopadmin -p password
root     32542 10102  0 16:58 pts/1    00:00:00 grep --color=auto beeline
 

Then I imported the certificate into /usr/jdk64/jdk1.8.0_77/jre/lib/security/cacert:

keytool -import -trustcacerts -keystore /usr/jdk64/jdk1.8.0_77/jre/lib/security/cacert -storepass changeit -noprompt -alias knox -file /tmp/knox.crt

But I still get the same error when starting beeline:

beeline -u "jdbc:hive2://knoxnode:8443/;ssl=true;transportMode=http;httpPath=gateway/default/hive" -n hadoopadmin -p password

17/01/05 16:58:46 [main]: ERROR jdbc.HiveConnection: Error opening session
org.apache.thrift.transport.TTransportException: javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target

You mentioned restarting the beeline client; I am not sure what exactly that means.


At this point you know it is an SSL certificate issue, based on the error. You need to find where the problem is. Maybe the certificate you exported is not correct; try validating it and exporting it again. Below is the routine I use for troubleshooting. Run through it, and if it still doesn't work, download SSLPoke and troubleshoot:

openssl s_client -connect <knox hostname>:8443 <<<'' | openssl x509 -out ./ssl.cert
keytool -import -alias <knoxhostname> -file ./ssl.cert -keystore /usr/jdk64/jdk1.8.0_77/jre/lib/security/cacerts

SYMPTOM: Sometimes a Hadoop service may fail to connect over SSL and give an error like this: javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target

ROOT CAUSE: Here are the possible reasons:
1. The JVM used by the Hadoop service is not using the correct certificate or the correct truststore.
2. The certificate is not signed by a trusted CA.
3. The Java trusted CA certificate chain is not available.

HOW TO DEBUG: Here are the steps to narrow down the problem with the SSL certificate:

STEP 1: Analyze the SSL connection to the SSL-enabled service (either Ranger or Knox in this case) using the SSLPoke utility. Download it from: https://confluence.atlassian.com/download/attachments/117455/SSLPoke.java It's a simple Java program that connects to server:port over SSL, tries to write a byte, and returns the response.

STEP 2: Compile and run SSLPoke like this:

$ java SSLPoke <SSL-service-hostname> <SSL-service-port>

If there is an error, it should print an error similar to the one shown above.
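For example, against the Knox gateway from this thread (compile once with javac; hostname taken from the question):

$ javac SSLPoke.java
$ java SSLPoke qwang-hdp5.field.hortonworks.com 8443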

Next, test the connection with the truststore that the Hadoop service is supposed to be using.

STEP 3: If the Hadoop service is using the default JRE truststore, import the SSL service's certificate and run SSLPoke again.

3a. Extract the certificate from the SSL service:

$ openssl s_client -connect <SSL-service-hostname>:<SSL-service-port> <<<'' | openssl x509 -out ./ssl.cert

3b. Import the certificate into the default JRE truststore:

$ keytool -import -alias <SSL-service-hostname> -file ./ssl.cert -keystore $JAVA_HOME/jre/lib/security/cacerts

3c. Run SSLPoke again:

$ java SSLPoke <SSL-service-hostname> <SSL-service-port>

STEP 4: If the Hadoop service is using a custom SSL truststore, specify that truststore in the SSLPoke command and test the connection:

$ java -Djavax.net.ssl.trustStore=/path/to/truststore SSLPoke <SSL-service-hostname> <SSL-service-port>

The commands in steps 3c and 4 will show an error in case there is any problem. Work from those clues to reach the actual problem and fix it.

STEP 5: With a correct SSL setup, SSLPoke shows a success message:

$ java -Djavax.net.ssl.trustStore=/path/to/truststore SSLPoke <SSL-service-hostname> <SSL-service-port>
Successfully connected

Keep iterating until the SSL connection is successful, then replicate the same settings for the Hadoop service and it should work.

Master Collaborator

@dvillarreal

So I used SSLPoke to connect to the Knox server and it works fine:

[root@qwang-hdp2 sslpoke]# java -Djavax.net.ssl.trustStore=/usr/jdk64/jdk1.8.0_77/jre/lib/security/cacert SSLPoke qwang-hdp5.field.hortonworks.com 8443
Successfully connected

I guess the question now is what truststore beeline is using. Where can I get that information?
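One idea for surfacing that (assuming the stock HDP launcher scripts pass HADOOP_CLIENT_OPTS through to the beeline JVM, which I haven't verified) would be to enable JSSE debug output, which prints the truststore path at startup:

export HADOOP_CLIENT_OPTS="-Djavax.net.debug=ssl"
beeline -u "jdbc:hive2://knoxnode:8443/;ssl=true;transportMode=http;httpPath=gateway/default/hive" -n hadoopadmin -p password 2>&1 | grep -i "truststore is"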

Master Collaborator

Found the answer in another HCC post:

https://community.hortonworks.com/questions/16887/beeline-connect-via-knox-ssl-issue.html

The truststore and its password need to be included in the Hive connection string:

beeline -u "jdbc:hive2://knoxnode:8443/;ssl=true;sslTrustStore=/root/myLocalTrustStore.jks;trustStorePassword=<password>;transportMode=http;httpPath=gateway/default/hive" -n hadoopadmin -p <password>

The steps are as follows: first get the certificate from the Knox server, then add the cert to a truststore file that the client will use. Keep the keystore password handy; that is the password for the connection string. You don't have to add the cert to the default Java truststore.

openssl s_client -connect ${knoxserver}:8443 <<<'' | openssl x509 -out /tmp/knox.crt
keytool -import -keystore myLocalTrustStore.jks -file /tmp/knox.crt
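For reference, the end-to-end sequence on the client node looks something like this (alias and passwords are illustrative; keytool prompts for a keystore password if -storepass is omitted):

openssl s_client -connect ${knoxserver}:8443 <<<'' | openssl x509 -out /tmp/knox.crt
keytool -import -noprompt -alias knox -file /tmp/knox.crt -keystore /root/myLocalTrustStore.jks -storepass <password>
beeline -u "jdbc:hive2://knoxnode:8443/;ssl=true;sslTrustStore=/root/myLocalTrustStore.jks;trustStorePassword=<password>;transportMode=http;httpPath=gateway/default/hive" -n hadoopadmin -p <password>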