Support Questions


Oozie/hive-server2 not able to connect to hive-metastore when SASL enabled

Explorer

I'm using CDH5. I have set up a Hive Metastore to use SASL, i.e. the hive-site.xml has the following properties:

 

<property>
  <name>hive.metastore.sasl.enabled</name>
  <value>true</value>
</property>

<property>
  <name>hive.metastore.kerberos.keytab.file</name>
  <value>/etc/hive/conf/hive.keytab</value>
</property>

<property>
  <name>hive.metastore.kerberos.principal</name>
  <value>hive/hive-metastore.xxxxxxx.com@XXXXXXX.COM</value>
</property>

 

The logs show no errors on starting the hive-metastore service.

 

I'm trying to run a Hive action in an Oozie workflow. The oozie-site.xml file has the following property:

 

    <property>
        <name>oozie.credentials.credentialclasses</name>
        <value>hcat=org.apache.oozie.action.hadoop.HCatCredentials</value>
    </property>

 

And the workflow XML file has the credentials section:

 

    <credentials>
        <credential name='hive_credentials' type='hcat'>
               <property>
                    <name>hcat.metastore.uri</name>
                    <value>thrift://hive-metastore.xxxxxxx.com:9083</value>
               </property>
               <property>
                    <name>hcat.metastore.principal</name>
                    <value>hive/hadoop-metastore.xxxxxxx.com@XXXXXXX.COM</value>
               </property>
         </credential>
    </credentials>

 

The Hive action refers to the credentials using the 'cred' attribute:

 

    <action name="hive" cred="hive_credentials">
        <hive xmlns="uri:oozie:hive-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <job-xml>${appPath}/hive-site.xml</job-xml>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <script>${appPath}/queries.hql</script>
        </hive>
        <ok to="pass"/>
        <error to="fail"/>
    </action>

 

When I try to run this workflow, I get the following error.

 

Exception in addtoJobConf
MetaException(message:Could not connect to meta store using any of the URIs provided. Most recent failure: org.apache.thrift.transport.TTransportException: No common protection layer between client and server
        at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:221)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:297)
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:288)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:169)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:109)
        at org.apache.oozie.action.hadoop.HCatCredentialHelper.getHCatClient(HCatCredentialHelper.java:87)
        at org.apache.oozie.action.hadoop.HCatCredentialHelper.set(HCatCredentialHelper.java:52)
        at org.apache.oozie.action.hadoop.HCatCredentials.addtoJobConf(HCatCredentials.java:58)
        at org.apache.oozie.action.hadoop.JavaActionExecutor.setCredentialTokens(JavaActionExecutor.java:990)
        at org.apache.oozie.action.hadoop.JavaActionExecutor.submitLauncher(JavaActionExecutor.java:851)
        at org.apache.oozie.action.hadoop.JavaActionExecutor.start(JavaActionExecutor.java:1071)
        at org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:217)
        at org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:62)
        at org.apache.oozie.command.XCommand.call(XCommand.java:280)
        at org.apache.oozie.service.CallableQueueService$CompositeCallable.call(CallableQueueService.java:323)
        at org.apache.oozie.service.CallableQueueService$CompositeCallable.call(CallableQueueService.java:252)
        at org.apache.oozie.service.CallableQueueService$CallableWrapper.run(CallableQueueService.java:174)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
        at java.lang.Thread.run(Thread.java:662)
)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:334)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:169)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:109)
        at org.apache.oozie.action.hadoop.HCatCredentialHelper.getHCatClient(HCatCredentialHelper.java:87)
        at org.apache.oozie.action.hadoop.HCatCredentialHelper.set(HCatCredentialHelper.java:52)
        at org.apache.oozie.action.hadoop.HCatCredentials.addtoJobConf(HCatCredentials.java:58)

 

It looks like Oozie is not able to connect to the Hive Metastore using its Hive client.

When I start hive-server2 and try to connect using Beeline, I get the same exception message (i.e. javax.security.sasl.SaslException: No common protection layer between client and server).

I'm able to connect to the metastore using the Hive CLI.

Any idea what could be causing this issue?

 

Thanks,

Terance.

1 ACCEPTED SOLUTION

Mentor

Sorry, the XML appears to have been eaten up. Here are clearer instructions for releases prior to CDH 5.1.0:

 

# SSH to Oozie machine
# Log on as root
# Do below:
cat > /tmp/hive-site.xml
<configuration><property><name>hadoop.rpc.protection</name><value>privacy</value></property></configuration>
^D
cd /tmp/
jar cf hive-rpc-protection.jar hive-site.xml
mv hive-rpc-protection.jar /var/lib/oozie/
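# (Why this works, as far as I understand it: jars under /var/lib/oozie are on
#  the Oozie server's classpath in CDH, so the embedded hive-site.xml supplies
#  hadoop.rpc.protection=privacy to the HMS client the credentials code creates.)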
# Restart Oozie server, and retry your WF.


14 REPLIES

Mentor
Where exactly is the hive.metastore.sasl.enabled property applied? Are you certain it is applied to the running HiveMetaStore server?

Does a regular Hive CLI configured with hive.metastore.uris instead of DB properties run properly (i.e. show tables, etc. work fine)?

Explorer

The hive.metastore.sasl.enabled property is set in /etc/hive/conf/hive-site.xml. The same hive-site.xml is also present in the Oozie app directory and is referred to in workflow.xml as

 

<job-xml>${appPath}/hive-site.xml</job-xml>

 

If the hive-site.xml has the hive.metastore.uris property, I think the Hive CLI will use the metastore URI and not connect to the DB directly. When this property is present in hive-site.xml along with hive.metastore.sasl.enabled, hive.metastore.kerberos.keytab.file and hive.metastore.kerberos.principal, I'm able to use the Hive CLI and run show tables, etc. So I guess the Hive CLI is able to talk to the metastore, but Oozie is not.
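For reference, this is how the uris property looks in that hive-site.xml (same host and port as in the workflow credentials above):

<property>
  <name>hive.metastore.uris</name>
  <value>thrift://hive-metastore.xxxxxxx.com:9083</value>
</property>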

Mentor
To have Oozie talk to a secured HiveMetaStore, you need to follow the credentials procedure detailed at
http://archive.cloudera.com/cdh5/cdh/5/oozie/DG_UnifiedCredentialsModule.html

Basically:
1. Enable Credentials of HCat type at the Oozie Server (via service configuration, requires restart).
2. Add a credentials section to your workflow. This section also configures the HMS location and the SPN.
3. Add the credential name to the action that requires a token for authentication (your hive action).

Explorer
Thanks Harsh, I've done all of these. Please see my first post.

Mentor
Ugh, very sorry. I replied to the above post via email, thinking it was a wholly new question.

OK, so looking further in, the Oozie server is already using a TSaslTransport for the HMS client connection, so the property is likely not the problem.

Do you perhaps have "hadoop.rpc.protection" on your cluster set to the non-default value of "privacy" (for traffic encryption in HDFS and the like)?
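That is, something along these lines in your cluster's core-site.xml (purely illustrative of the setting I mean):

<property>
  <name>hadoop.rpc.protection</name>
  <value>privacy</value>
</property>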

Explorer
Yes Harsh, the 'hadoop.rpc.protection' property is set to 'privacy'. Does this property affect communication between the Hive Metastore and its clients?

Mentor
It does. We have seen this error from the same cause in another issue internally (with CM canaries over HMS). Does adding the hadoop.rpc.protection property (with the "privacy" option set) to the config file you pass in the WF help with getting the HMS connection through?
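Roughly, that would mean adding this to the hive-site.xml referenced by your <job-xml> (an untested sketch, mirroring the cluster-wide value):

<property>
  <name>hadoop.rpc.protection</name>
  <value>privacy</value>
</property>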

Explorer

You mean add the 'hadoop.rpc.protection' property with value 'privacy' to the hive-site.xml file that is used in the workflow XML? I tried this and I'm still getting the same exception.

Mentor
Thank you for trying that out! I investigated further into the code and it turns out you are hitting https://issues.apache.org/jira/browse/OOZIE-1593, which we have backported in our CDH 5.1.0 release. With that fix added, the Credentials code will properly load hadoop.rpc.protection when making HMS connections, but it does not appear to do so in prior releases, even if you specify it as part of your action configuration.

This is the fixed code line, if you are interested in taking a look:
https://github.com/cloudera/oozie/blob/cdh5.1.0-release/core/src/main/java/org/apache/oozie/action/h...
(Note the missing line in 5.0.0 sources, at
https://github.com/cloudera/oozie/blob/cdh5.0.0-release/core/src/main/java/org/apache/oozie/action/h...

If you are unable to upgrade immediately, you can perhaps try something like the below as a workaround:

# SSH to Oozie machine
# Log on as root
# Do below:
cat > /tmp/hive-site.xml
hadoop.rpc.protectionprivacy
^D
cd /tmp/
jar cf hive-rpc-protection.jar hive-site.xml
mv hive-rpc-protection.jar /var/lib/oozie/
# Restart Oozie server, and retry your WF.