
oozie workflow failed in sqoop command

Hello all, I am able to run the Sqoop command successfully on its own. But when I added it to an Oozie workflow XML, I cannot get it to run through Oozie. The job STATUS is KILLED and the error code is JA018.

Here is my workflow:

<workflow-app xmlns="uri:oozie:workflow:0.4" name="oozie-wf">
    <start to="sqoop-wf"/>
    <action name="sqoop-wf">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <command>import --connect jdbc:oracle:thin:@dbservername:1521:database --username user --password passwprd --query "SELECT TASKID FROM TASKS WHERE TASKTIMESTAMP BETWEEN TO_DATE('2017-02-13 00:00:00', 'yyyy-mm-dd hh24:mi:ss') AND TO_DATE('2017-02-13 23:59:59', 'yyyy-mm-dd hh24:mi:ss') AND \$CONDITIONS" --fields-terminated-by '|' -m 1 --target-dir /dbdata/mytest</command>
        </sqoop>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Failed, Error Message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
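One known gotcha worth checking (a sketch, not a confirmed fix for this exact error): Oozie splits the `<command>` string on whitespace, so a quoted `--query` containing spaces, like the one above, gets broken into separate tokens. The documented workaround is to pass one `<arg>` element per token; inside `<arg>`, the shell quoting and the backslash before `$CONDITIONS` are no longer needed. Using the values from the workflow above:

```xml
<sqoop xmlns="uri:oozie:sqoop-action:0.2">
    <arg>import</arg>
    <arg>--connect</arg>
    <arg>jdbc:oracle:thin:@dbservername:1521:database</arg>
    <arg>--username</arg>
    <arg>user</arg>
    <arg>--password</arg>
    <arg>passwprd</arg>
    <arg>--query</arg>
    <arg>SELECT TASKID FROM TASKS WHERE TASKTIMESTAMP BETWEEN TO_DATE('2017-02-13 00:00:00', 'yyyy-mm-dd hh24:mi:ss') AND TO_DATE('2017-02-13 23:59:59', 'yyyy-mm-dd hh24:mi:ss') AND $CONDITIONS</arg>
    <arg>--fields-terminated-by</arg>
    <arg>|</arg>
    <arg>-m</arg>
    <arg>1</arg>
    <arg>--target-dir</arg>
    <arg>/dbdata/mytest</arg>
</sqoop>
```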

Could someone please help me? I am new to Oozie and have been stuck on this error for a while.

Thanks in advance


Re: oozie workflow failed in sqoop command


Are you doing this on HDP 2.5? Please go to the job in the Resource Manager and paste the actual error. Also, I have an article you can review for some common gotchas on HDP 2.5, specifically with Hive import. It doesn't look like that's what you're doing, but your error code points to Hive.
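For reference, the underlying error can usually be pulled from the command line as well (the workflow and application IDs below are placeholders; substitute your own):

```shell
# Show which action failed and its external (YARN) application ID
oozie job -info 0000001-170213123456789-oozie-oozi-W

# Dump the container logs for that application, including the real stack trace
yarn logs -applicationId application_1487105716370_1426
```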


Re: oozie workflow failed in sqoop command

Hi Arvits,

I am using HDP, and I am trying to import data from a database into HDFS using an Oozie workflow. I have a question for you: I execute the oozie command as the mapred user, but I keep getting "Permission denied: user=mapred, access=READ". The directory is owned by mapred (drwxrwxrwx - mapred hadoop 0 2016-11-01 09:40 /mr-history); however, when the Oozie job starts, a file is generated under it by the hdfs user, and mapred ends up with no read access to it. Can you please show me where I can set or change the HDFS permissions for mapred? Please see the log below:

Caused by: org.apache.hadoop.ipc.RemoteException( Permission denied: user=mapred, access=READ, inode="/mr-history/tmp/hdfs/job_1487105716370_1426.summary":hdfs:hdfs:-rwxrwx---
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(
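For what it's worth, the inode in the trace is `hdfs:hdfs:-rwxrwx---`: mapred is not the owner (hdfs) and, presumably, not in the hdfs group, so it falls through to the "other" bits (`---`) and READ is denied, regardless of the drwxrwxrwx on /mr-history itself. A hedged sketch of the usual remedies (paths taken from the trace; run as the HDFS superuser, and check your cluster's `mapreduce.jobhistory.intermediate-done-dir` setting first):

```shell
# Inspect the ownership and mode of the staging files the trace complains about
hdfs dfs -ls /mr-history/tmp

# One common fix: give the job-history staging directory the permissive,
# sticky-bit mode typically used for the intermediate-done directory
hdfs dfs -chmod -R 1777 /mr-history/tmp
```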