Created on 05-16-2014 01:58 AM - last edited on 10-04-2019 02:34 PM by ask_bill_brooks
I'm using CDH 5. I have set up the Hive Metastore to use SASL; i.e., hive-site.xml has the following properties:
<property>
  <name>hive.metastore.sasl.enabled</name>
  <value>true</value>
</property>
<property>
  <name>hive.metastore.kerberos.keytab.file</name>
  <value>/etc/hive/conf/hive.keytab</value>
</property>
<property>
  <name>hive.metastore.kerberos.principal</name>
  <value>hive/hive-metastore.xxxxxxx.com@XXXXXXX.COM</value>
</property>
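(For reference, the keytab entries can be listed with klist to confirm they contain the configured principal; the path below is the one from hive-site.xml above.)
# list the principals stored in the metastore keytab
klist -kt /etc/hive/conf/hive.keytab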
The logs show no errors on starting the hive-metastore service.
I'm trying to run a Hive action in an Oozie workflow. The oozie-site.xml file has the following property:
<property>
  <name>oozie.credentials.credentialclasses</name>
  <value>hcat=org.apache.oozie.action.hadoop.HCatCredentials</value>
</property>
The workflow XML file has the credentials tag:
<credentials>
  <credential name='hive_credentials' type='hcat'>
    <property>
      <name>hcat.metastore.uri</name>
      <value>thrift://hive-metastore.xxxxxxx.com:9083</value>
    </property>
    <property>
      <name>hcat.metastore.principal</name>
      <value>hive/hadoop-metastore.xxxxxxx.com@XXXXXXX.COM</value>
    </property>
  </credential>
</credentials>
The Hive action refers to the credentials using the 'cred' attribute:
<action name="hive" cred="hive_credentials"> <hive xmlns="uri:oozie:hive-action:0.2"> <job-tracker>${jobTracker}</job-tracker> <name-node>${nameNode}</name-node> <job-xml>${appPath}/hive-site.xml</job-xml> <configuration> <property> <name>mapred.job.queue.name</name> <value>${queueName}</value> </property> </configuration> <script>${appPath}/queries.hql</script> </hive> <ok to="pass"/> <error to="fail"/> </action>
When I try to run this workflow, I get the following error.
Exception in addtoJobConf
MetaException(message:Could not connect to meta store using any of the URIs provided. Most recent failure: org.apache.thrift.transport.TTransportException: No common protection layer between client and server
    at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:221)
    at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:297)
    at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:288)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:169)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:109)
    at org.apache.oozie.action.hadoop.HCatCredentialHelper.getHCatClient(HCatCredentialHelper.java:87)
    at org.apache.oozie.action.hadoop.HCatCredentialHelper.set(HCatCredentialHelper.java:52)
    at org.apache.oozie.action.hadoop.HCatCredentials.addtoJobConf(HCatCredentials.java:58)
    at org.apache.oozie.action.hadoop.JavaActionExecutor.setCredentialTokens(JavaActionExecutor.java:990)
    at org.apache.oozie.action.hadoop.JavaActionExecutor.submitLauncher(JavaActionExecutor.java:851)
    at org.apache.oozie.action.hadoop.JavaActionExecutor.start(JavaActionExecutor.java:1071)
    at org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:217)
    at org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:62)
    at org.apache.oozie.command.XCommand.call(XCommand.java:280)
    at org.apache.oozie.service.CallableQueueService$CompositeCallable.call(CallableQueueService.java:323)
    at org.apache.oozie.service.CallableQueueService$CompositeCallable.call(CallableQueueService.java:252)
    at org.apache.oozie.service.CallableQueueService$CallableWrapper.run(CallableQueueService.java:174)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
    at java.lang.Thread.run(Thread.java:662)
)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:334)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:169)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:109)
    at org.apache.oozie.action.hadoop.HCatCredentialHelper.getHCatClient(HCatCredentialHelper.java:87)
    at org.apache.oozie.action.hadoop.HCatCredentialHelper.set(HCatCredentialHelper.java:52)
    at org.apache.oozie.action.hadoop.HCatCredentials.addtoJobConf(HCatCredentials.java:58)
It looks like Oozie is not able to connect to the Hive Metastore using its Hive client.
When I start HiveServer2 and try to connect using beeline, I get the same exception message (i.e. javax.security.sasl.SaslException: No common protection layer between client and server).
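(The beeline attempt looks roughly like this; host, port, database, and principal are placeholders rather than the exact values used:)
beeline -u "jdbc:hive2://hive-server2.xxxxxxx.com:10000/default;principal=hive/hive-server2.xxxxxxx.com@XXXXXXX.COM"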
I'm able to connect to the metastore using the Hive CLI.
Any idea what could be causing this issue?
Thanks,
Terance.
Created on 08-03-2014 11:43 PM - edited 08-03-2014 11:43 PM
Sorry, the XML appears to have gotten eaten up. Here are clearer instructions for releases prior to CDH 5.1.0:
# SSH to Oozie machine
# Log on as root
# Do below:
cat > /tmp/hive-site.xml
<configuration>
  <property>
    <name>hadoop.rpc.protection</name>
    <value>privacy</value>
  </property>
</configuration>
^D
cd /tmp/
jar cf hive-rpc-protection.jar hive-site.xml
mv hive-rpc-protection.jar /var/lib/oozie/
# Restart Oozie server, and retry your WF.
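(To sanity-check the override, the jar contents can be listed after the move; it should contain only the hive-site.xml created above.)
# verify the jar the Oozie server will pick up from /var/lib/oozie
jar tf /var/lib/oozie/hive-rpc-protection.jar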
Created on 08-09-2018 05:43 AM - edited 08-09-2018 07:06 AM
Hi,
Are there any instructions for CDH 5.13.1?
I have a similar error message for the same config.
Hadoop RPC Protection: Authenticate
Error Message:
WARN org.apache.oozie.action.hadoop.HCatCredentials: SERVER[myhost] USER[ABC] Group[ABC] APP[ABCDEFGHIJKL] JOB[JOBID] Action[MyAction] Exception in addtoJobConf
org.apache.hive.hcatalog.common.HCatException : 9001 : Exception occurred while processing HCat request : TException while getting delegation token.. Cause : org.apache.thrift.transport.TTransportException: java.net.SocketException: Connection reset
Created 10-04-2019 02:03 PM
I'm getting the same kind of issue/error while running a Spark job.
"JA009 JA009: org.apache.hive.hcatalog.common.HCatException : 9001 : Exception occurred while processing HCat request : TException while getting delegation token.. Cause : org.apache.thrift.transport.TTransportException: java.net.SocketTimeoutException: Read timed out "
We are currently using CDH and Cloudera Manager version 5.13.3. Can you please suggest whether the procedure you provided above for version 5.1.1* can be applied to 5.13?
Appreciate your response.
Thanks
Mannan.