
How to Set Up HiveServer2 Authentication with LDAP SSL (No Knox)

 
1 ACCEPTED SOLUTION

Here is how I got it to work.

In order for tools such as Hive and Beeline to use LDAPS, you need to make a global change to HADOOP_OPTS pointing at your CA certs, so that they are loaded with Hadoop in general. This assumes you have imported the (self-signed) certificate into a truststore located at /etc/pki/java/cacerts.

In HDFS -> Configs -> Hadoop Env Template, add the following:

export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true -Djavax.net.ssl.trustStore=/etc/pki/java/cacerts -Djavax.net.ssl.trustStorePassword=changeit ${HADOOP_OPTS}"
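The certificate import mentioned above can be done with keytool. This is a sketch, not the exact commands from this thread: ldap-ca.pem is a placeholder filename for your exported CA certificate, and changeit is the default truststore password.

```shell
# Import the self-signed LDAP CA cert into the system-wide Java truststore.
# ldap-ca.pem is a hypothetical filename for your exported CA certificate.
keytool -importcert -noprompt \
  -alias ldap-ca \
  -file ldap-ca.pem \
  -keystore /etc/pki/java/cacerts \
  -storepass changeit

# Verify the cert actually landed in the truststore.
keytool -list -keystore /etc/pki/java/cacerts -storepass changeit -alias ldap-ca
```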


Note: Components like Knox and Ranger do not use hadoop-env; each needs its own config set for LDAP SSL, plus a manual restart.

Why a manual restart? It seems that when you start these services with Ambari, there is no way to set user options so that Ambari picks them up and passes them to the Java processes of Ranger and Knox at startup. The certs are only picked up when Ranger and Knox are restarted manually.

Note also that Hive View does not work with LDAP or LDAP SSL.


8 REPLIES

Master Collaborator

Personally I haven't set up LDAP SSL, but here are the properties you can set in hive-site.xml:

hive.server2.authentication = LDAP
hive.server2.authentication.ldap.url = <LDAP URL>
hive.server2.authentication.ldap.baseDN = <LDAP Base DN>
hive.server2.use.SSL = true
hive.server2.keystore.path = <KEYSTORE FILE PATH>
hive.server2.keystore.password = <KEYSTORE PASSWORD>
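With those properties in place, a Beeline client supplies LDAP credentials on the JDBC URL. The hostname, port, and user below are placeholders, not values from this thread:

```shell
# Connect to HiveServer2 with LDAP username/password authentication.
# hs2.example.com and myldapuser are hypothetical.
beeline -u "jdbc:hive2://hs2.example.com:10000/default" \
  -n myldapuser -p 'myldappassword'

# If hive.server2.use.SSL=true, the client must also trust the HS2 certificate:
beeline -u "jdbc:hive2://hs2.example.com:10000/default;ssl=true;sslTrustStore=/etc/pki/java/cacerts;trustStorePassword=changeit" \
  -n myldapuser -p 'myldappassword'
```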

Note that hive.server2.keystore.path and hive.server2.keystore.password are ONLY for SSL encryption of the HiveServer2 endpoint itself. They have nothing to do with LDAP SSL.

Both LDAP and SSL are covered in the Apache Hive docs:

Authentication/Security Configuration

Setting up SSL Encryption

Isn't SSL encryption different from LDAPS authentication? The keystore path is different.

You're right. For LDAPS you just need to make sure the LDAP server's SSL certificate is trusted by the JVM that runs HS2. If using a self-signed (or otherwise untrusted) certificate, import it into the corresponding cacerts, usually under $JAVA_HOME/jre/lib/security/cacerts.
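If you don't already have the server's certificate on hand, one way to fetch and import it looks like this. The hostname ldap.example.com and the alias are placeholders:

```shell
# Grab the LDAP server's certificate over the LDAPS port (636).
# ldap.example.com is a hypothetical hostname.
openssl s_client -connect ldap.example.com:636 -showcerts </dev/null \
  | openssl x509 -outform PEM > ldap-server.pem

# Import it into the JVM default truststore so the HS2 JVM trusts it.
keytool -importcert -noprompt -alias ldap-server \
  -file ldap-server.pem \
  -keystore "$JAVA_HOME/jre/lib/security/cacerts" \
  -storepass changeit
```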


Explorer

@amcbarnett@hortonworks.com Can you confirm you really needed the -D settings after you imported your cert into the truststore? The arguments you added are the defaults.

@carter@hortonworks.com Yes, the only way it worked is when I used the -D settings.

However, I have since been told that in order for Hadoop to use the cert, we should import it into $JAVA_HOME/jre/lib/security/cacerts instead of /etc/pki/java/cacerts, which we thought was the default.

So apparently, if you are using any trustStore other than $JAVA_HOME/jre/lib/security/cacerts, you need the -D settings.
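To make the "default" concrete: a small shell helper can print which truststore a given JVM would fall back to when no -Djavax.net.ssl.trustStore is set. The JDK 9+ fallback path is an assumption added here, not something from this thread:

```shell
# Print the truststore path a JVM uses by default when no
# -Djavax.net.ssl.trustStore override is given.
jvm_default_truststore() {
  local java_home="$1"
  if [ -f "$java_home/jre/lib/security/cacerts" ]; then
    # JDK 8 and earlier keep cacerts under the jre/ directory.
    echo "$java_home/jre/lib/security/cacerts"
  else
    # JDK 9+ dropped the jre/ directory (assumption for newer JDKs).
    echo "$java_home/lib/security/cacerts"
  fi
}
```

Anything outside these paths, such as /etc/pki/java/cacerts, requires the explicit -D settings shown earlier in the thread.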

I haven't had a chance to test this, as the folks I am working with got it to work with the -D settings using /etc/java/cacerts and do not want to make any further changes.