Sqoop Oozie Coordinator job killed issue

Explorer

I am trying to import data from Postgres into Hive with Sqoop, scheduled through an Oozie coordinator job with the frequency set to 5 minutes.
The coordinator starts after the start time and the data is loaded into Hive on the first run, but the next run, triggered 5 minutes later, is killed.
What is the problem here?
Below are my properties and XML files:

Coordinator.xml file:

<coordinator-app name="coordinator1" frequency="${frequency}" start="${startTime}" end="${endTime}"
                 timezone="${timezone}" xmlns="uri:oozie:coordinator:0.4">
    <action>
        <workflow>
            <app-path>${workflowPath}</app-path>
        </workflow>
    </action>
</coordinator-app>

Job.properties:

frequency=5
startTime=2019-08-07T10:22Z
endTime=2019-08-07T10:32Z
timezone=UTC
nameNode=hdfs://HadoopNode3:8020
jobTracker=HadoopNode3:8032
queueName=default
oozie.action.sharelib.for.sqoop=sqoop,hive
oozie.use.system.libpath=true
workflowPath=/user/oozieschedules/sqoop/coordinator
oozie.coord.application.path=/user/oozieschedules/sqoop/coordinator

Workflow.xml:

<workflow-app name="sqoop-coordwf" xmlns="uri:oozie:workflow:0.4">
    <start to="sqoop-coordaction"/>
    <action name="sqoop-coordaction">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <prepare>
                <delete path="hdfs://HadoopNode3:8020/user/cloudera/student"/>
            </prepare>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <arg>import</arg>
            <arg>--connect</arg>
            <arg>jdbc:postgresql://ipaddress:5432/inet</arg>
            <arg>--username</arg>
            <arg>postgres</arg>
            <arg>--password</arg>
            <arg>postgres</arg>
            <arg>--table</arg>
            <arg>student</arg>
            <arg>--hive-import</arg>
            <arg>--hive-table</arg>
            <arg>test</arg>
            <arg>--create-hive-table</arg>
            <arg>-m</arg>
            <arg>1</arg>
            <file>/user/oozieschedules/sqoop/coordinator/hive-site.xml#hive-site.xml</file>
        </sqoop>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Failed, Error Message</message>
    </kill>
    <end name="end"/>
</workflow-app>

Could anyone please help me resolve this issue?

1 ACCEPTED SOLUTION

Super Guru

Hi,

Without the actual error message it is quite hard to troubleshoot. For future reference, please share the error log if possible; it helps to determine the cause.

If you want me to guess, I would think it is caused by the fact that you create the same Hive table every time the workflow is triggered. Do you intend to overwrite the table or append data to it?

Cheers

Eric
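
To illustrate the two options: if the intent is to overwrite the table on every run, the --create-hive-table argument could be replaced with Sqoop's --hive-overwrite. A minimal sketch of just the relevant <arg> lines, reusing the table name test from the workflow above:

<!-- Sketch only: overwrite the data in the existing Hive table "test"
     on each run instead of trying to re-create the table. All other
     arguments of the sqoop action stay as posted in the question. -->
<arg>--hive-import</arg>
<arg>--hive-table</arg>
<arg>test</arg>
<arg>--hive-overwrite</arg>

If the intent is to append instead, dropping --create-hive-table alone should be enough, because --hive-import only creates the table when it does not already exist.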

3 REPLIES

Explorer

I want to append the data to the same table.

Error log:

Intercepting System.exit(1)

<<< Invocation of Main class completed <<<

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]

Oozie Launcher failed, finishing Hadoop job gracefully.

The log file contains only the above error. Please help me resolve this issue.
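
A side note on troubleshooting: the launcher output above usually hides the real Sqoop/Hive stack trace. A rough sketch of how the full logs could be pulled, assuming the Oozie and YARN command-line tools are available and using placeholder IDs:

# List the workflow's actions together with their external (YARN) application IDs
oozie job -info <workflow-job-id>

# Print the Oozie-side log for the job
oozie job -log <workflow-job-id>

# Fetch the full container logs, which normally contain the underlying
# Sqoop or Hive error
yarn logs -applicationId <application-id>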

Explorer

Thanks a lot.

I removed the --create-hive-table argument and the job now executes successfully.
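
For anyone who hits the same error: below is a sketch of what the argument list looks like after the fix described in this thread, assuming append behaviour is wanted. It is simply the original action with --create-hive-table removed; Sqoop's --hive-import creates the table only if it does not already exist, so later coordinator runs should append to it.

<!-- Sketch only: same sqoop action as posted in the question,
     minus --create-hive-table, so each 5-minute run appends to
     the existing Hive table "test". -->
<arg>import</arg>
<arg>--connect</arg>
<arg>jdbc:postgresql://ipaddress:5432/inet</arg>
<arg>--username</arg>
<arg>postgres</arg>
<arg>--password</arg>
<arg>postgres</arg>
<arg>--table</arg>
<arg>student</arg>
<arg>--hive-import</arg>
<arg>--hive-table</arg>
<arg>test</arg>
<arg>-m</arg>
<arg>1</arg>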