
Oozie/hive-server2 not able to connect to hive-metastore when SASL enabled

Explorer

I'm using CDH 5. I have set up the Hive Metastore to use SASL, i.e. hive-site.xml has the following properties:

 

<property>
  <name>hive.metastore.sasl.enabled</name>
  <value>true</value>
</property>

<property>
  <name>hive.metastore.kerberos.keytab.file</name>
  <value>/etc/hive/conf/hive.keytab</value>
</property>

<property>
  <name>hive.metastore.kerberos.principal</name>
  <value>hive/hive-metastore.xxxxxxx.com@XXXXXXX.COM</value>
</property>

 

The logs show no errors on starting the hive-metastore service.
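As a sanity check (using the keytab path and principal from the config above), the keytab entries can be listed and a ticket requested for the configured principal:

# List the principals stored in the metastore keytab.
klist -kt /etc/hive/conf/hive.keytab

# Try to obtain a ticket as the configured metastore principal.
kinit -kt /etc/hive/conf/hive.keytab hive/hive-metastore.xxxxxxx.com@XXXXXXX.COM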

 

I'm trying to run a Hive action in an Oozie workflow. The oozie-site.xml file has the following property:

 

    <property>
        <name>oozie.credentials.credentialclasses</name>
        <value>hcat=org.apache.oozie.action.hadoop.HCatCredentials</value>
    </property>

 

The workflow XML file has the credentials tag:

 

    <credentials>
        <credential name='hive_credentials' type='hcat'>
               <property>
                    <name>hcat.metastore.uri</name>
                    <value>thrift://hive-metastore.xxxxxxx.com:9083</value>
               </property>
               <property>
                    <name>hcat.metastore.principal</name>
                    <value>hive/hadoop-metastore.xxxxxxx.com@XXXXXXX.COM</value>
               </property>
         </credential>
    </credentials>

 

The hive action refers to the credentials using the 'cred' attribute.

 

    <action name="hive" cred="hive_credentials">
        <hive xmlns="uri:oozie:hive-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <job-xml>${appPath}/hive-site.xml</job-xml>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <script>${appPath}/queries.hql</script>
        </hive>
        <ok to="pass"/>
        <error to="fail"/>
    </action>
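For completeness, the ${jobTracker}, ${nameNode}, ${appPath} and ${queueName} parameters above come from the job configuration; a minimal job.properties along these lines would drive the workflow (the endpoints and paths below are placeholders, not the actual cluster values):

# Placeholder endpoints -- substitute the real NameNode and ResourceManager/JobTracker addresses.
nameNode=hdfs://namenode.xxxxxxx.com:8020
jobTracker=resourcemanager.xxxxxxx.com:8032
queueName=default
appPath=${nameNode}/user/myuser/hive-wf
oozie.wf.application.path=${appPath}
# Make the Oozie Hive sharelib available to the launcher job.
oozie.use.system.libpath=true

The workflow is then submitted with the standard Oozie CLI, e.g. oozie job -config job.properties -run (plus -oozie <oozie-url> if OOZIE_URL is not set).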

 

When I try to run this workflow, I get the following error.

 

Exception in addtoJobConf
MetaException(message:Could not connect to meta store using any of the URIs provided. Most recent failure: org.apache.thrift.transport.TTransportException: No common protection layer between client and server
        at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:221)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:297)
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:288)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:169)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:109)
        at org.apache.oozie.action.hadoop.HCatCredentialHelper.getHCatClient(HCatCredentialHelper.java:87)
        at org.apache.oozie.action.hadoop.HCatCredentialHelper.set(HCatCredentialHelper.java:52)
        at org.apache.oozie.action.hadoop.HCatCredentials.addtoJobConf(HCatCredentials.java:58)
        at org.apache.oozie.action.hadoop.JavaActionExecutor.setCredentialTokens(JavaActionExecutor.java:990)
        at org.apache.oozie.action.hadoop.JavaActionExecutor.submitLauncher(JavaActionExecutor.java:851)
        at org.apache.oozie.action.hadoop.JavaActionExecutor.start(JavaActionExecutor.java:1071)
        at org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:217)
        at org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:62)
        at org.apache.oozie.command.XCommand.call(XCommand.java:280)
        at org.apache.oozie.service.CallableQueueService$CompositeCallable.call(CallableQueueService.java:323)
        at org.apache.oozie.service.CallableQueueService$CompositeCallable.call(CallableQueueService.java:252)
        at org.apache.oozie.service.CallableQueueService$CallableWrapper.run(CallableQueueService.java:174)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
        at java.lang.Thread.run(Thread.java:662)
)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:334)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:169)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:109)
        at org.apache.oozie.action.hadoop.HCatCredentialHelper.getHCatClient(HCatCredentialHelper.java:87)
        at org.apache.oozie.action.hadoop.HCatCredentialHelper.set(HCatCredentialHelper.java:52)
        at org.apache.oozie.action.hadoop.HCatCredentials.addtoJobConf(HCatCredentials.java:58)

 

It looks like Oozie is not able to connect to the Hive Metastore using its Hive client.

When I start HiveServer2 and try to connect using Beeline, I get the same exception message (i.e. javax.security.sasl.SaslException: No common protection layer between client and server).
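For reference, the Beeline attempt that hits this SaslException is along these lines (the host name and principal here are placeholders for the real values):

# Placeholder host/principal -- substitute the actual HiveServer2 host and Kerberos principal.
beeline -u "jdbc:hive2://hive-server2.xxxxxxx.com:10000/default;principal=hive/hive-server2.xxxxxxx.com@XXXXXXX.COM"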

I'm able to connect to the metastore using hive cli.

Any idea what could be causing this issue?

 

Thanks,

Terance.

1 ACCEPTED SOLUTION

Mentor

Sorry, the XML appears to have got eaten up. Here are clearer instructions for releases prior to CDH 5.1.0:

 

# SSH to the Oozie machine and log on as root, then run:
cat > /tmp/hive-site.xml <<'EOF'
<configuration>
  <property>
    <name>hadoop.rpc.protection</name>
    <value>privacy</value>
  </property>
</configuration>
EOF
cd /tmp/
jar cf hive-rpc-protection.jar hive-site.xml
mv hive-rpc-protection.jar /var/lib/oozie/
# Restart the Oozie server, and retry your workflow.
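For context: the 'No common protection layer' error is a SASL quality-of-protection mismatch between the client and the metastore, and this jar simply puts a hive-site.xml with hadoop.rpc.protection=privacy on the Oozie server's classpath so its Hive client negotiates the same protection level. A quick sanity check before restarting Oozie (plain jar tooling, nothing Oozie-specific):

# The jar should list hive-site.xml at its root.
jar tf /var/lib/oozie/hive-rpc-protection.jar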


14 REPLIES


Explorer
Thanks Harsh, that worked.

New Contributor

Hi,

 

Are there any instructions for CDH 5.13.1?

 

I have a similar error message for the same config. 

 

Hadoop RPC Protection: Authenticate

 

Error Message:

WARN org.apache.oozie.action.hadoop.HCatCredentials: SERVER[myhost] USER[ABC] Group[ABC] APP[ABCDEFGHIJKL] JOB[JOBID] Action[MyAction] Exception in addtoJobConf
org.apache.hive.hcatalog.common.HCatException : 9001 : Exception occurred while processing HCat request : TException while getting delegation token.. Cause : org.apache.thrift.transport.TTransportException: java.net.SocketException: Connection reset

 

Contributor

@Harsh J 

 

I'm getting the same kind of issue/error while running a Spark job.

"JA009    JA009: org.apache.hive.hcatalog.common.HCatException : 9001 : Exception occurred while processing HCat request : TException while getting delegation token.. Cause : org.apache.thrift.transport.TTransportException: java.net.SocketTimeoutException: Read timed out "

We are currently using CDH and Cloudera Manager version 5.13.3. Can you please suggest whether the procedure you provided above (for releases prior to CDH 5.1.0) can be applied to 5.13?

Appreciate your response.

 

Thanks

Mannan.

Mentor
The original issue described here is not applicable to your version. In your case it could simply be a misconfiguration that's causing Oozie not to load the right Hive configuration required to talk to the Hive service. Try enabling debug logging on the Oozie server if you are unable to find an error in it. Also try to locate files or jars in your workflow that may be supplying an invalid Hive client XML.
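A minimal sketch of what enabling debug logging could look like, assuming the stock oozie-log4j.properties layout (the logger/appender names may differ on your install); restart the Oozie server afterwards:

# In oozie-log4j.properties on the Oozie server:
# raise the Oozie logger from the default INFO to DEBUG.
log4j.logger.org.apache.oozie=DEBUG, oozie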