
Hive2 Oozie Action Doesn't Consistently Connect

Explorer

Hi All,

 

I've been configuring an Oozie hive2 action that simply drops a pre-existing table. The command lives in an .hql file, and I'm scheduling it with Oozie via Hue.

 

For no apparent reason this job will sometimes work, but will also sporadically throw an error (stderr logs below):

Unknown HS2 problem when communicating with Thrift server.
Error: Could not open client transport with JDBC Uri: jdbc:hive2://.... on port 10000 (state=08S01,code=0)

This ends with a System.exit(2) call. I've fiddled with the retry properties (max and interval) in the Hue workflow editor, and this appears to have an effect on my success rate, but the job still will not work reliably.
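One way to rule out basic connectivity and Kerberos issues is to reproduce the connection by hand with Beeline from an edge node. This is only a sketch: the keytab path, principal, hostname, and database below are placeholders, not values from the original post.

```shell
# Obtain a Kerberos ticket first (keytab path and principal are placeholders).
kinit -kt /path/to/user.keytab user@EXAMPLE.COM

# Attempt the same HiveServer2 connection the Oozie hive2 action makes.
# Hostname, port, database, and HS2 principal are placeholders.
beeline -u "jdbc:hive2://hs2-host.example.com:10000/default;principal=hive/_HOST@EXAMPLE.COM" \
        -e "SELECT 1;"
```

If this manual connection also fails intermittently, the problem is on the HiveServer2/network side rather than in the Oozie workflow definition.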

 

The relevant stdout logs are below:

 

<<< Invocation of Beeline command completed <<<

No child hadoop job is executed.
Intercepting System.exit(2)

<<< Invocation of Main class completed <<<

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.Hive2Main], exit code [2]

Oozie Launcher failed, finishing Hadoop job gracefully

 

I should mention that this is running on a Kerberized cluster: CDH 5.13.1, Hive2, Sqoop 1.4.6.

 

Are there any Hive parameters that could be preventing a consistent, reliable connection? I've also checked the YARN logs, but to no avail. Has anyone experienced this error before?

 

 

2 ACCEPTED SOLUTIONS

Explorer

I've since found the following:

 

https://www.cloudera.com/documentation/enterprise/5-13-x/topics/cdh_oozie_sqoop_jdbc.html

 

The documentation suggests loading the data into HDFS first and then creating a table with a hive2 action. This has since solved my problem and I have a working job. I appreciate your help!
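A minimal sketch of that two-step approach follows. All connection strings, credentials, paths, column definitions, and table names are placeholders invented for illustration; they are not from the original post.

```shell
# Step 1: Sqoop the data into a plain HDFS directory (no direct Hive import).
# JDBC URL, credentials, and paths are placeholders.
sqoop import \
  --connect jdbc:mysql://db-host.example.com/sales \
  --username etl_user --password-file /user/etl/.db_password \
  --table orders \
  --target-dir /user/etl/staging/orders \
  --fields-terminated-by '\t'

# Step 2: In a separate hive2 action, create an external table over those files.
# The schema here is a placeholder and must match the imported columns.
beeline -u "jdbc:hive2://hs2-host.example.com:10000/default;principal=hive/_HOST@EXAMPLE.COM" -e "
  CREATE EXTERNAL TABLE IF NOT EXISTS orders (
    order_id INT,
    amount   DOUBLE
  )
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  LOCATION '/user/etl/staging/orders';"
```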

 

 


Expert Contributor

Hi,

 

There are two options you can use.

 

1. Sqoop import with --hive-import (a one-step Hive import)

2. Sqoop import to HDFS, then create a Hive table on top of the imported files

 

Either of the above options will achieve the desired result.
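The one-step variant might look like the following sketch. The JDBC URL, credentials, and table names are placeholders, not values from this thread.

```shell
# One-step import: Sqoop pulls the data and creates/loads the Hive table itself.
# Connection details, credentials, and table names are placeholders.
sqoop import \
  --connect jdbc:mysql://db-host.example.com/sales \
  --username etl_user --password-file /user/etl/.db_password \
  --table orders \
  --hive-import \
  --hive-table staging.orders \
  --create-hive-table
```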

 

Let me know if you have any questions.

 

Regards

Nitish

View solution in original post
