Member since: 11-05-2018
Posts: 8
Kudos Received: 0
Solutions: 1

My Accepted Solutions
| Title | Views | Posted |
| --- | --- | --- |
| | 8649 | 11-28-2018 10:19 AM |
11-28-2018 10:19 AM
I've since found the following: https://www.cloudera.com/documentation/enterprise/5-13-x/topics/cdh_oozie_sqoop_jdbc.html, which suggests loading the data to HDFS and then creating the table with a hive2 action. This solved my problem and I now have a working job. I appreciate your help!
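Roughly, the working job follows the sketch below: a plain Sqoop import into HDFS, followed by a hive2 action that runs an .hql script against HiveServer2. The action names, staging path, and the create_and_load.hql script name are illustrative placeholders, not my exact job:

```xml
<!-- Step 1: plain Sqoop import to an HDFS staging dir (no --hive-import) -->
<action name="sqoop-import">
    <sqoop xmlns="uri:oozie:sqoop-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <command>import --m 1 --connect jdbc:mysql://host:port/navigator --username user --password-file /user/32230704/passwords --table HIVE_AUDIT_EVENTS_2018_10_29 --delete-target-dir --target-dir /tmp/sqoop-staging/HIVE_AUDIT_EVENTS_2018_10_29</command>
    </sqoop>
    <ok to="hive2-create"/>
    <error to="Kill"/>
</action>
<!-- Step 2: hive2 action runs an .hql script that creates the table
     over the staged data, going through HiveServer2 instead of the
     embedded metastore client that the Sqoop action was failing on -->
<action name="hive2-create">
    <hive2 xmlns="uri:oozie:hive2-action:0.1">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <jdbc-url>${jdbcURL}</jdbc-url>
        <script>create_and_load.hql</script>
    </hive2>
    <ok to="End"/>
    <error to="Kill"/>
</action>
```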
11-28-2018 09:55 AM
I'm still getting the same error.
11-28-2018 09:54 AM
<workflow-app name="Sqoop and Hive2 Test" xmlns="uri:oozie:workflow:0.5">
<start to="sqoop-a3af"/>
<kill name="Kill">
<message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
<action name="sqoop-a3af">
<sqoop xmlns="uri:oozie:sqoop-action:0.2">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<command>import --m 1 --connect jdbc:mysql://host:name/navigator --username root --password-file /user/32230704/passwords --table HIVE_AUDIT_EVENTS_2018_10_29 --delete-target-dir --target-dir=/user/hive/warehouse/das_audit_navigator_db.db/HIVE_AUDIT_EVENTS_2018_10_29 --hive-import --hive-overwrite --hive-table das_audit_navigator_db.amore_audit_test</command>
<file>/user/32230704/oozie-oozi/hive-site.xml#hive-site.xml</file>
</sqoop>
<ok to="End"/>
<error to="Kill"/>
</action>
<end name="End"/>
</workflow-app>
11-28-2018 09:51 AM
I add the file path in the parameter tab provided by Hue at runtime. The job.xml does not reflect this change and gets overwritten each time I submit.
11-28-2018 09:38 AM
<workflow-app name="Sqoop and Hive2 Test" xmlns="uri:oozie:workflow:0.5">
<start to="sqoop-a3af"/>
<action name="sqoop-a3af">
<sqoop xmlns="uri:oozie:sqoop-action:0.2">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<command>import --m 1 --connect jdbc:mysql://host:port/navigator --username user --password-file /user/32230704/passwords --table HIVE_AUDIT_EVENTS_2018_10_29 --delete-target-dir --target-dir=/user/hive/warehouse/das_audit_navigator_db.db/HIVE_AUDIT_EVENTS_2018_10_29 --hive-import --hive-overwrite --hive-table das_audit_navigator_db.amore_audit_test</command>
<file>/user/32230704/oozie-oozi/hive-site.xml#hive-site.xml</file>
</sqoop>
<ok to="End"/>
<error to="Kill"/>
</action>
<kill name="Kill">
<message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
<end name="End"/>
</workflow-app>

oozie.use.system.libpath=True
send_email=False
dryrun=False
credentials={u'hcat': {'xml_name': u'hcat', 'properties': [('hcat.metastore.uri', u'thrift://host:port'), ('hcat.metastore.principal', u'hive/host@host')]}, u'hive2': {'xml_name': u'hive2', 'properties': [('hive2.jdbc.url', u'jdbc:hive2://host:port/default;ssl=true;sslTrustStore=/opt/cloudera/security_ca/jks/cloudera.truststore'), ('hive2.server.principal', u'hive/host@host')]}, u'hbase': {'xml_name': u'hbase', 'properties': []}}
nameNode=hdfs://name
submit_single_action=True
jobTracker=yarnRM
security_enabled=True

Path to hive-site.xml: /user/32230704/oozie-oozi/hive-site.xml
11-28-2018 06:59 AM
Thanks for the response. I've placed my hive-site.xml into the file properties of the Sqoop Oozie job as requested. I made sure to download the client version. I'm still getting the following error message from stderr:

FAILED: SemanticException org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
Intercepting System.exit(1)
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]

My stdout logs are as follows:

<<< Invocation of Sqoop command completed <<<
Hadoop Job IDs executed by Sqoop: job_1541100802669_1684
Intercepting System.exit(1)
<<< Invocation of Main class completed <<<
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]
Oozie Launcher failed, finishing Hadoop job gracefully
Oozie Launcher, uploading action data to HDFS sequence file:
Successfully reset security manager from org.apache.oozie.action.hadoop.LauncherSecurityManager@65dbaa54 to null
Oozie Launcher ends
11-08-2018 10:10 AM
When I try to run a single Sqoop command with the Hive table creation included in a single action, I get the following error:

Logging initialized using configuration in jar:file:/u05/hadoop/yarn/nm/filecache/467/hive-exec.jar!/hive-log4j.properties
FAILED: SemanticException org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
Intercepting System.exit(1)
11-07-2018 08:40 AM
Hi All,
I've been configuring an Oozie hive2 action that simply drops a pre-existing table. I'm running the command from an .hql file and scheduling it with Oozie on Hue.
For no apparent reason, this job will sometimes work but will also sporadically throw an error (stderr logs below):
Unknown HS2 problem when communicating with Thrift server.
Error: Could not open client transport with JDBC Uri: jdbc:hive2://....
on port 10000 (state=08S01,code=0)
This ends with a System.exit(2) call. I've fiddled around with the retry properties (max and interval) within the Hue workflow editor, and this would appear to have an effect on my success rate, but the job still will not work reliably.
The relevant stdout logs are below:
<<< Invocation of Beeline command completed <<<
No child hadoop job is executed.
Intercepting System.exit(2)
<<< Invocation of Main class completed <<<
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.Hive2Main], exit code [2]
Oozie Launcher failed, finishing Hadoop job gracefully
I should mention that this is running on a Kerberized cluster: CDH 5.13.1, Hive2, Sqoop 1.4.6.
Are there any Hive parameters that could be preventing a consistent, reliable connection? I've also checked the YARN logs, but to no avail. Has anyone experienced this error before?
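For reference, the retry settings I've been adjusting correspond to the per-action retry-max and retry-interval attributes in the generated workflow XML. A minimal sketch of how they sit on the action (the action name, jdbcURL parameter, and drop_table.hql script are just examples, not my exact job):

```xml
<!-- retry-max = number of retries, retry-interval = minutes between attempts -->
<action name="hive2-drop" retry-max="3" retry-interval="1">
    <hive2 xmlns="uri:oozie:hive2-action:0.1">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <jdbc-url>${jdbcURL}</jdbc-url>
        <script>drop_table.hql</script>
    </hive2>
    <ok to="End"/>
    <error to="Kill"/>
</action>
```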
Labels:
- Apache Hive
- Apache Oozie
- Cloudera Hue
- Kerberos