
error in oozie Class org.apache.oozie.action.hadoop.SqoopMain not found

Expert Contributor

workflow.xml:

<workflow-app name="once-a-day" xmlns="uri:oozie:workflow:0.1">
    <start to="sqoopAction"/>
    <action name="sqoopAction">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <command>import-all-tables --connect jdbc:mysql://xyz.syz/erp --username hive --password hive --export-dir /user/hive/warehouse/sbl</command>
        </sqoop>
        <ok to="end"/>
        <error to="killJob"/>
    </action>
    <kill name="killJob">
        <message>"Killed job due to error: ${wf:errorMessage(wf:lastErrorNode())}"</message>
    </kill>
    <end name="end"/>
</workflow-app>

job.properties:

nameNode=hdfs://syz.syz.com:8020
jobTracker=xyz.syz.com:8050
queueName=default
oozie.use.system.libpath=true
oozie.coord.application.path=${nameNode}/user/${user.name}/scheduledimport
start=2013-09-01T00:00Z
end=2013-12-31T00:00Z
workflowAppUri=${nameNode}/user/${user.name}/scheduledimport

I get an error on the sqoop task:

java.lang.ClassNotFoundException: Class org.apache.oozie.action.hadoop.SqoopMain not found

I have share/lib inside /user/oozie.

How do I fix this?

1 ACCEPTED SOLUTION


@simran kaur

Please list the content of your shared lib:

hdfs dfs -ls /user/oozie/share/lib/lib_20160430195631

hdfs dfs -ls -R /user/oozie/share/lib/lib_20160430195631/oozie

I can check with my installation if there is any missing jar.
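For reference, a quick way to confirm the sharelib actually contains the action classes is to look for the `oozie-sharelib-sqoop` jar, which is where `SqoopMain` is packaged. A sketch, assuming the timestamp directory from this thread; the exact `lib_...` name varies per install:

```shell
# Find the current sharelib timestamp directory
hdfs dfs -ls /user/oozie/share/lib

# SqoopMain should be inside oozie-sharelib-sqoop-*.jar under the sqoop subdirectory
hdfs dfs -ls /user/oozie/share/lib/lib_20160430195631/sqoop

# After any change to the sharelib, make the Oozie server re-scan it
oozie admin -sharelibupdate -oozie http://<oozie_host>:11000/oozie
```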


19 REPLIES

Expert Contributor

Does your Oozie sharelib in HDFS have the Sqoop-related dependencies?

If not, set up the sharelib following this link
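On an HDP cluster, (re)creating the sharelib is usually done with `oozie-setup.sh`. A minimal sketch, assuming the default HDP install paths and a NameNode at `syz.syz.com:8020` as in the job.properties above; adjust paths and hostnames for your cluster:

```shell
# Create (or recreate) the Oozie sharelib in HDFS from the bundled tarball.
# Run as the oozie user; the paths below are typical HDP defaults, not verified here.
sudo -u oozie /usr/hdp/current/oozie-server/bin/oozie-setup.sh sharelib create \
    -fs hdfs://syz.syz.com:8020 \
    -locallib /usr/hdp/current/oozie-server/oozie-sharelib.tar.gz

# Restart the Oozie server afterwards, or tell it to pick up the new sharelib:
oozie admin -sharelibupdate -oozie http://<oozie_host>:11000/oozie
```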

Expert Contributor

Yes, I sure do. What else could possibly be the problem?

Rising Star

Hi @simran kaur,

You can list the available sharelibs with the following command:

sudo -u oozie oozie admin -shareliblist -oozie http://<oozie_host>:11000/oozie
[Available ShareLib]
oozie
hive
distcp
hcatalog
sqoop
mapreduce-streaming
spark
hive2
pig

Also, is the mysql driver in your lib folder of the workflow application as described here? https://oozie.apache.org/docs/4.2.0/WorkflowFunctionalSpec.html#a7_Workflow_Application_Deployment
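Per the deployment spec linked above, the JDBC driver belongs in a `lib/` directory next to workflow.xml. A sketch of the expected layout for the application path used in this thread's job.properties; the connector jar name is an example, not taken from the thread:

```shell
# Expected HDFS layout of the workflow application:
#   /user/<user>/scheduledimport/workflow.xml
#   /user/<user>/scheduledimport/lib/mysql-connector-java-<version>.jar
hdfs dfs -ls -R /user/<user>/scheduledimport

# Upload the driver if it is missing (example jar name):
hdfs dfs -put mysql-connector-java-5.1.38.jar /user/<user>/scheduledimport/lib/
```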

Hope this helps, Chris

Expert Contributor

This is what I have in my sharelib:

[Available ShareLib]

hive 
mapreduce-streaming 
oozie 
sqoop 
pig

Still, I keep getting the same error. Yes, I do have the lib folder along with the connector jar.


Rising Star

Does this work?

sqoop list-databases --connect jdbc:mysql://xyz.syz/ --username hive --password hive

Expert Contributor

@Christian Guegi: No, I still get the same error. Not even list-databases or list-tables works.

Expert Contributor

@Christian Guegi: I could run the import and list commands from the command line directly, though. Somehow Oozie is not able to pick up the share/lib. Also, is sharelib different from share/lib? I have share/lib in /user/oozie but nothing called sharelib.

Expert Contributor

@Christian Guegi: Now, after a fresh install, I can run the list-databases and list-tables commands but not import. I do have the JDBC driver inside the lib folder, in the same directory as workflow.xml, but do I need to reference this jar from any of my files?