09-22-2014 10:54 AM
I've been trying to authenticate our users to our Hadoop cluster. I'm in a (mostly) Windows environment connected to an Active Directory. We're currently testing the setup on CDH 5.1.2 in a Cloudera Manager managed environment (CM 5.1.2).
So far, I've managed to :
1-Connect Hue to the LDAP backend
2-Enable SSL for Hue (using a generated certificate for the host that was signed by our IT dept)
3-Connect HiveServer2 to the LDAP backend (the relevant properties are sketched below)
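For reference, the HiveServer2 side of the LDAP setup boils down to a handful of properties. The URL and domain below are placeholders, not our real AD values:

hive.server2.authentication = LDAP
hive.server2.authentication.ldap.url = ldap://ad.example.com
hive.server2.authentication.ldap.Domain = example.com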
But now, I'm stuck at enabling SSL for HS2.
What I've done :
1-Checked "Enable SSL for HiveServer" in the Service-Wide Security settings, which set "hive.server2.enable.SSL" to true in the config
2-Set the appropriate path to the SSL Keystore (hive.server2.keystore.path)
3-Set the appropriate password to the SSL Keystore (hive.server2.keystore.password)
4-Imported my certificate into the keystore using the keytool -import command (roughly as shown after this list)
5-Restarted the services
6-In the advanced options of the ODBC driver on my Windows machine, added the correct path to my PEM file containing the SSL CA certificates
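Concretely, steps 2 through 4 looked roughly like this. The paths, alias and file names are placeholders for what I actually used:

hive.server2.keystore.path = /opt/cloudera/security/jks/hive-keystore.jks
hive.server2.keystore.password = <keystore password>
keytool -import -alias hs2 -file hiveserver2-host.crt -keystore /opt/cloudera/security/jks/hive-keystore.jks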
Now, whenever I try to connect to port 10000 using the ODBC driver (Cloudera Hive driver v2.5.10), I get the following errors:
Client Side : Error from Hive: SSL_connect: sslv3 alert handshake failure.
Server Side : java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: javax.net.ssl.SSLHandshakeException: no cipher suites in common
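In case it helps to reproduce this outside the ODBC driver, the handshake can also be tested directly with openssl (the hostname is just an example); it shows whether the server can complete an SSL handshake at all:

openssl s_client -connect hiveserver2-host.example.com:10000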
Any ideas what I missed?
09-22-2014 02:46 PM
Turns out the error was misleading.
The way I imported my certificate was wrong: I needed to import the certificate together with its private key! keytool -import on its own only adds a trusted certificate entry, so the keystore never contained the private key HiveServer2 needs to negotiate a cipher suite, hence the "no cipher suites in common" error.
see this link : http://cunning.sharp.fm/2008/06/importing_private_keys_into_a.html
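For anyone who lands on the same problem, the route described in that article goes through PKCS#12: bundle the key and cert with openssl, then pull the bundle into the JKS keystore with keytool -importkeystore. File names and the alias below are just examples:

openssl pkcs12 -export -in hiveserver2-host.crt -inkey hiveserver2-host.key -out hiveserver2-host.p12 -name hs2
keytool -importkeystore -srckeystore hiveserver2-host.p12 -srcstoretype PKCS12 -destkeystore /opt/cloudera/security/jks/hive-keystore.jks -srcalias hs2

After redoing the import that way and restarting HiveServer2, the handshake error was gone.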