
​Every time I run sqoop command using oozie, I get Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1] error.

New Contributor

Every time I run a sqoop command using Oozie, I get a Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1] error. Here is what I have tried:

  1. I have tried putting the 'mysql connector' jar under a lib folder inside the directory that contains workflow.xml, and I have referenced it in workflow.xml too.
  2. I have also put the 'mysql connector' jar under /user/oozie/share/lib/sqoop, /user/oozie/share/lib/lib_20151118030154/sqoop, /user/oozie/share/lib, and /user/oozie/share/lib/sqoop/lib.
  3. And I have added oozie.use.system.libpath=true to the job.properties file.

Each time, I restarted the server after placing the jar files and before running the Oozie command.
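
For reference, the sharelib can also be checked and refreshed from the command line instead of restarting the server. This is only a sketch: it assumes the quickstart VM's default Oozie URL, the timestamped sharelib directory mentioned above, and the sharelib commands available in the Oozie 4.x that ships with CDH 5.

# List the jars Oozie currently sees in the sqoop sharelib
oozie admin -oozie http://quickstart.cloudera:11000/oozie -shareliblist sqoop

# Copy the connector into the active (timestamped) sharelib directory and
# tell Oozie to pick it up without a server restart
hdfs dfs -put mysql-connector-java-5.1.15-bin.jar \
    /user/oozie/share/lib/lib_20151118030154/sqoop/
oozie admin -oozie http://quickstart.cloudera:11000/oozie -sharelibupdate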

Additionally, I tried downloading the oozie-core/2.3.2-cdh3u3 jar from http://grepcode.com/file/repository.cloudera.com/content/repositories/releases/com.yahoo.oozie/oozie... which contains the SqoopMain class, and added it to all the paths where oozie-core jars are kept on the system. I even referenced this jar file in workflow.xml under <archive> after placing it in the same lib folder as in step 1. Still no luck.

Oozie is able to run the map-reduce and other programs that I have mapped in workflow.xml, but the sqoop jobs fail and show up with status KILLED in Hue.

job.properties

nameNode=hdfs://quickstart.cloudera:8020
jobTracker=quickstart.cloudera:8032
queueName=default
examplesRoot=examples
oozie.libpath=${nameNode}/user/oozie/share/lib
oozie.use.system.libpath=true
oozie.wf.rerun.failnodes=true
user.name=cloudera
oozie.wf.application.path=${nameNode}/user/${user.name}/${examplesRoot}/apps/map-reduce/workflow.xml
outputDir=outputOzzie
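
The layout Oozie expects for the application directory is roughly the following: workflow.xml plus a lib/ subdirectory next to it, whose jars are added to the action classpath automatically. The paths below are the ones already used above, so this is just a way to verify them.

# Verify the workflow application layout in HDFS
hdfs dfs -ls /user/cloudera/examples/apps/map-reduce/
# should list workflow.xml and a lib/ directory
hdfs dfs -ls /user/cloudera/examples/apps/map-reduce/lib/
# should list mysql-connector-java-5.1.15-bin.jar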

workflow.xml

<workflow-app xmlns="uri:oozie:workflow:0.2" name="map-reduce-wf">
    <start to="sqoopAction"/>
    <action name="sqoopAction">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <command>import --connect jdbc:mysql://localhost/TEST --hive-import --table pet -m 1 --create-hive-table default.pet --username root --password cloudera</command>
            <archive>/user/cloudera/examples/apps/map-reduce/lib/mysql-connector-java-5.1.15-bin.jar</archive>
        </sqoop>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Map/Reduce failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
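
A quick way to rule out workflow-definition problems and to get the action-level error message is the Oozie CLI. This is a sketch: the workflow id is a placeholder printed by the -run call, and the Oozie URL is the quickstart default.

# Validate the workflow file against the Oozie schema, then submit it
oozie validate workflow.xml
oozie job -oozie http://quickstart.cloudera:11000/oozie -config job.properties -run

# Inspect the failed action and pull the workflow log (replace the id below
# with the one printed by -run)
oozie job -oozie http://quickstart.cloudera:11000/oozie -info 0000001-000000000000000-oozie-oozi-W
oozie job -oozie http://quickstart.cloudera:11000/oozie -log 0000001-000000000000000-oozie-oozi-W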

Alternatively, I tried putting the sqoop command in a shell script and running it, but it fails. If I put a map-reduce job or echo commands in the shell script, Oozie runs the job successfully, but it fails for sqoop commands with a Main class [org.apache.oozie.action.hadoop.ShellMain], exit code [1] error.

job.properties

nameNode=hdfs://quickstart.cloudera:8020
jobTracker=quickstart.cloudera:8032
queueName=default
oozie.libpath=${nameNode}/user/oozie/share/lib
oozie.use.system.libpath=true
oozie.wf.rerun.failnodes=true
user.name=cloudera
oozie.wf.application.path=${nameNode}/user/cloudera/examples/shellscript/workflow.xml 
outputDir=SqoopShellOutput

workflow.xml

<workflow-app name="script_oozie_job" xmlns="uri:oozie:workflow:0.3">
    <start to="Test"/>
    <action name="Test">
        <shell xmlns="uri:oozie:shell-action:0.1">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <exec>sqoopcmd.sh</exec>
            <file>hdfs://quickstart.cloudera:8020/user/cloudera/examples/shellscript/sqoopcmd.sh#sqoopcmd.sh</file>
        </shell>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Script failed</message>
    </kill>
    <end name="end"/>
</workflow-app>

sqoopcmd.sh

#!/bin/bash
# sqoop script
sqoop import-all-tables \
    -m 1 \
    --connect jdbc:mysql://localhost/TEST \
    --username=root \
    --password=cloudera \
    --warehouse-dir=/user/hive/warehouse \
    --hive-import 
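
As a side note, the shell action runs on whatever node the Oozie launcher lands on, not necessarily the host where the CLI works, so a short diagnostic preamble (just a sketch, not the script used above) makes the reason for an exit code 1 visible in the action's stdout:

#!/bin/bash
# Diagnostic preamble (illustrative): show where and as whom the action runs
echo "host: $(hostname)  user: $(whoami)"
echo "PATH: $PATH"
which sqoop || echo "sqoop client not found on this node"
# On a multi-node cluster, 'localhost' in the JDBC URL would point at the
# task's own node rather than the MySQL host; on the single-node quickstart
# VM this does not matter.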

There are no diagnostics for any of these jobs in the YARN Resource Manager.
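
For completeness, the Sqoop and shell actions each run inside an Oozie launcher MapReduce job, so the useful stack trace usually sits in that launcher's task logs rather than in the Resource Manager overview. A sketch of how to pull them (the application id is a placeholder, and yarn logs needs log aggregation enabled):

# Find the oozie launcher application for the failed workflow
yarn application -list -appStates ALL | grep -i oozie

# Dump its aggregated logs; the SqoopMain/ShellMain error appears in the
# launcher map task's stdout/stderr
yarn logs -applicationId application_0000000000000_0001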

Moreover, I have tried this on both Cloudera versions, 5.5 and 5.7, but no luck.

Please suggest how I can resolve this issue.

7 Replies

Re: ​Every time I run sqoop command using oozie, I get Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1] error.

Guru

I don't think the Oozie jar you've downloaded is the right one: you're talking about v2.3.2, whereas the Oozie shipped with current HDP or CDH is actually 4.x.

You may consider using the correct version and getting it to work from the CLI first; executing through Oozie adds a layer of complexity that can hide the real issue.
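
A couple of CLI sanity checks along those lines (a sketch; /var/lib/sqoop is the usual connector location on CDH and may differ elsewhere):

# Confirm the Sqoop client itself can see the JDBC driver and reach MySQL
ls /var/lib/sqoop/*.jar
sqoop list-tables --connect jdbc:mysql://localhost/TEST --username root -P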

Re: ​Every time I run sqoop command using oozie, I get Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1] error.

New Contributor

@Laurent Edel I have been using the oozie-core jar that came with the Cloudera VM before trying this one, but it still didn't work.

Re: ​Every time I run sqoop command using oozie, I get Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1] error.

Rising Star

@Pallavi Karan Does your <Sqoop install path>/lib directory have the corresponding JDBC jar file, here the MySQL JDBC connector jar?

Are you able to run normal sqoop import commands? If that works, then:

Add the mysql-connector-java jar to the lib path of the Oozie project root directory (the same path where the job.properties and workflow.xml files exist).

Also try adding mysql-connector-*.jar to the share/lib/sqoop directory (HDFS path).

And in your sqoopcmd.sh, add the parameter "--connection-manager org.apache.sqoop.manager.MySQLManager" to the sqoop command, as sketched below.
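
A sketch of what the adjusted command in sqoopcmd.sh could look like with that parameter added (all other arguments as in the original script):

# Force the MySQL connection manager explicitly
sqoop import-all-tables \
    --connection-manager org.apache.sqoop.manager.MySQLManager \
    -m 1 \
    --connect jdbc:mysql://localhost/TEST \
    --username root \
    --password cloudera \
    --warehouse-dir /user/hive/warehouse \
    --hive-import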

Re: ​Every time I run sqoop command using oozie, I get Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1] error.

New Contributor
@Dileep Kumar Chiguruvada

The sqoop import and export commands run fine on the CLI.

I did add the 'mysql connector' jar under the lib folder in my Oozie project, but no luck.

I have also put the 'mysql connector' jar under /user/oozie/share/lib/sqoop, /user/oozie/share/lib/lib_20151118030154/sqoop, /user/oozie/share/lib, and /user/oozie/share/lib/sqoop/lib in HDFS. Still no luck.

I tried adding "--connection-manager org.apache.sqoop.manager.MySQLManager" to the sqoop command; it is still not working.

<workflow-app xmlns="uri:oozie:workflow:0.2" name="map-reduce-wf">
    <start to="sqoopAction"/>
    <action name="sqoopAction">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <command>import --connect jdbc:mysql://localhost/TEST --hive-import --table pet -m 1 --create-hive-table default.pet --username root --password cloudera --connection-manager org.apache.sqoop.manager.MySQLManager</command>
            <archive>/user/cloudera/examples/apps/map-reduce/lib/mysql-connector-java-5.1.15-bin.jar</archive>
            <archive>/user/cloudera/examples/apps/map-reduce/lib/oozie-core-2.3.2-cdh3u3.jar</archive>
        </sqoop>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Map/Reduce failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>


Re: ​Every time I run sqoop command using oozie, I get Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1] error.

@Pallavi Karan what is your cluster configuration? And where is sqoop installed?

Re: ​Every time I run sqoop command using oozie, I get Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1] error.

New Contributor

This has not been solved yet? I have a similar problem - instead of import, it's a Sqoop plugin that I've coded.

Re: ​Every time I run sqoop command using oozie, I get Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1] error.

Expert Contributor

We are also facing this issue: Sqoop runs from the command line, but not from an Oozie job / Hue workflow.