
Error in Oozie: Class org.apache.oozie.action.hadoop.SqoopMain not found

Expert Contributor

workflow.xml:

<workflow-app name="once-a-day" xmlns="uri:oozie:workflow:0.1">
<start to="sqoopAction"/>
        <action name="sqoopAction">
                <sqoop xmlns="uri:oozie:sqoop-action:0.2">
                        <job-tracker>${jobTracker}</job-tracker>
                        <name-node>${nameNode}</name-node>
                    <command>import-all-tables --connect jdbc:mysql://xyz.syz/erp --username hive --password hive
                        --export-dir /user/hive/warehouse/sbl
                    </command>
                </sqoop>
                <ok to="end"/>
                <error to="killJob"/>
        </action>
<kill name="killJob">
            <message>"Killed job due to error: ${wf:errorMessage(wf:lastErrorNode())}"</message>
        </kill>
<end name="end" />
</workflow-app>

job.properties:

nameNode=hdfs://syz.syz.com:8020
jobTracker=xyz.syz.com:8050
queueName=default
oozie.use.system.libpath=true
oozie.coord.application.path=${nameNode}/user/${user.name}/scheduledimport
start=2013-09-01T00:00Z
end=2013-12-31T00:00Z
workflowAppUri=${nameNode}/user/${user.name}/scheduledimport

I get an error on the Sqoop task:

java.lang.ClassNotFoundException: Class org.apache.oozie.action.hadoop.SqoopMain not found

I have share/lib inside /user/oozie.

How do I fix this?

1 ACCEPTED SOLUTION

@simran kaur

Please list the content of your shared lib:

hdfs dfs -ls /user/oozie/share/lib/lib_{20160430195631}

hdfs dfs -ls -R /user/oozie/share/lib/lib_{20160430195631}/oozie

I can check with my installation if there is any missing jar.


19 REPLIES

Rising Star

Does your Oozie shared lib in HDFS have the Sqoop-related dependencies?

If not, set up the sharelib following this link.
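
Roughly, on an HDP install, recreating and refreshing the sharelib looks like the sketch below; the local oozie-setup.sh and sharelib tarball paths are assumptions and vary by version:

sudo -u oozie /usr/hdp/current/oozie-server/bin/oozie-setup.sh sharelib create -fs hdfs://syz.syz.com:8020 -locallib /usr/hdp/current/oozie-server/oozie-sharelib.tar.gz

oozie admin -sharelibupdate -oozie http://<oozie_host>:11000/oozie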

Expert Contributor

Yes, I sure do. What else could possibly be the problem?

Contributor

Hi @simran kaur,

You can list the available sharelibs with the following command:

sudo -u oozie oozie admin -shareliblist -oozie http://<oozie_host>:11000/oozie
[Available ShareLib]
oozie
hive
distcp
hcatalog
sqoop
mapreduce-streaming
spark
hive2
pig

Also, is the mysql driver in your lib folder of the workflow application as described here? https://oozie.apache.org/docs/4.2.0/WorkflowFunctionalSpec.html#a7_Workflow_Application_Deployment
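
For reference, a deployment that follows that spec would look roughly like this (the application path is the one from your job.properties; the driver jar name is illustrative):

hdfs dfs -ls -R /user/<username>/scheduledimport

/user/<username>/scheduledimport/workflow.xml
/user/<username>/scheduledimport/lib/mysql-connector-java.jar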

Hope this helps, Chris

Expert Contributor

This is what I have in my sharelib:

-shareliblist [Available ShareLib]

hive 
mapreduce-streaming 
oozie 
sqoop 
pig

I still keep getting the same error. Yes, I do have the lib folder along with the connector jar.

Contributor

Does this work?

sqoop list-databases --connect jdbc:mysql://xyz.syz/ --username hive --password hive

Expert Contributor

@Christian Guegi: No, I still get the same error. Not even list-databases or list-tables works.

Expert Contributor

@Christian Guegi: I could run the import and list commands from the command line directly, though. Somehow Oozie is not able to pick up the share/lib. Also, is sharelib different from share/lib? I have share/lib in /user/oozie but nothing called sharelib.

Expert Contributor

@Christian Guegi: Now, after a fresh install, I can run the list-databases and list-tables commands but not import. I do have the JDBC driver inside the lib folder in the same directory as the workflow.xml, but do I need to reference this jar inside the lib folder from any of my files?

Contributor

@simran: What HDP version are you using? Is the Oozie service check in Ambari successful?

Expert Contributor

@Christian Guegi: HDP 2.4. Yes, the service check in Ambari is successful.

Contributor

Hi,

The ShareLib concept is well described here.

Below is an example that works with HDP 2.2.4:

<workflow-app name="jar-test" xmlns="uri:oozie:workflow:0.4">
    <start to="db-import"/>
    <action name="db-import">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <command>list-tables --connect jdbc:mysql://<db-host>/hive --username hive --password hive</command>
            <archive>/user/<username>/wf-test/lib/mysql-connector-java.jar</archive>
         </sqoop>
        <ok to="end"/>
        <error to="kill"/>
    </action>
    <kill name="kill">
        <message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
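
A job.properties along these lines would pair with it (host names and the application path are placeholders):

nameNode=hdfs://<namenode-host>:8020
jobTracker=<resourcemanager-host>:8050
oozie.use.system.libpath=true
oozie.wf.application.path=${nameNode}/user/<username>/wf-test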

Hope it helps, Chris

Expert Contributor

@Christian Guegi: Still nothing. It's the very same error and it's not able to execute the sqoop command. What is the name of the jar file that contains the SqoopMain class?

New Contributor

@simran kaur

Please list the content of your shared lib:

hdfs dfs -ls /user/oozie/share/lib/lib_{20160430195631}

hdfs dfs -ls -R /user/oozie/share/lib/lib_{20160430195631}/oozie

I can check with my installation if there is any missing jar.

@simran kaur

Please do the following and let me know if it works:

1. Copy /usr/share/java/mysql-connector-java.jar to /user/oozie/share/lib/lib_{20160430195631}/sqoop

2. Restart all oozie components.

3. Change --export-dir to --warehouse-dir in the workflow.

4. Check the permission on /user/hive/warehouse/sbl. If possible point it to /tmp for testing.

Rerun the oozie workflow.
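
Steps 1 and 2 as a rough command sketch (the lib_ timestamp is the placeholder from above; -sharelibupdate can stand in for a full restart if you prefer):

sudo -u oozie hdfs dfs -put /usr/share/java/mysql-connector-java.jar /user/oozie/share/lib/lib_{20160430195631}/sqoop/

oozie admin -sharelibupdate -oozie http://<oozie_host>:11000/oozie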

Please let us know if this works.

Thanks

Expert Contributor
@simran kaur, There are a couple of ways to address this. One is what @rbiswas mentioned, and the other is to create a lib folder where the workflow resides in HDFS and place the mysql connector jar file there.

For example, if your workflow is located in HDFS at /user/simran/sqoopwf/workflow.xml, take the path "/user/simran/sqoopwf/" and create a lib folder there (hdfs dfs -mkdir -p /user/simran/sqoopwf/lib), then place the mysql connector jar in that location (hdfs dfs -put <mysql-connector-java-version.jar> /user/simran/sqoopwf/lib/).

Then kick off the oozie job which should work.
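
For example, something along the lines of:

oozie job -oozie http://<oozie_host>:11000/oozie -config job.properties -run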

Rising Star

@simran kaur

Can you please let me know if your issue is resolved?

I am facing the same issue in my Oozie workflow.

Thanks,

Rishit

New Contributor

@simran kaur Is the issue solved for you? I am getting the same error (java.lang.ClassNotFoundException: Class org.apache.oozie.action.hadoop.SqoopMain not found).

Can somebody please help.
