Support Questions

Find answers, ask questions, and share your expertise

[Solved]: Problem while submitting Oozie workflow for SqoopAction

Contributor

I am getting an error while executing an Oozie workflow for Sqoop. I can run `list-databases` to list all databases in MS SQL, and the Sqoop command also works fine when I run it outside the workflow. Now I am trying to import a table from the MS SQL database into Hive using Sqoop, but the import fails with the error below. Please help me resolve this.

Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]

Sqoop Command:

sqoop import --connect jdbc:sqlserver://<IP:PORT> --username <username> --password <password> --table <SQL_Table> --hive-import --hive-table <hivedb.hivetbl> -m 1

Following files are being used for workflow.

job.properties

nameNode=hdfs://sandbox.hortonworks.com:8020
jobTracker=sandbox.hortonworks.com:8050
queueName=default
appPath=${nameNode}/<HDFS_path_where_workflow.xml_file>
oozie.use.system.libpath=true
oozie.libpath=${nameNode}/user/oozie/share/lib/lib_20161025075203/
oozie.wf.application.path=${appPath}

workflow.xml

<workflow-app name="SqoopImportAction" xmlns="uri:oozie:workflow:0.4">
    <start to="sqoop-node"/>
    <action name="sqoop-node">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <job-xml>${appPath}/hive-site.xml</job-xml>
            <command>import --connect jdbc:sqlserver://<IP:PORT> --username <username> --password <password> --table <SQL_Table> --hive-import --hive-table <hivedb.hivetbl> -m 1</command>
            <archive>${appPath}/sqljdbc42.jar</archive>
        </sqoop>
        <ok to="end"/>
        <error to="kill"/>
    </action>
    <kill name="kill">
        <message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>

P.S.: I have also tried the following:

  1. Copying sqljdbc42.jar into appPath.
  2. Copying sqljdbc42.jar into the Oozie sharelib path, i.e. ${nameNode}/user/oozie/share/lib/lib_20161025075203/sqoop
  3. Copying hive-exec*.jar into the Oozie sharelib path.
  4. Running the workflow as the hdfs, oozie, and root users.

Please help me resolve this. Thanks in advance.
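For reference, a minimal sketch of how the JDBC driver jar can be staged into the Sqoop sharelib and the Oozie server told to pick it up (the HDFS path follows the sharelib directory in the job.properties above; the Oozie URL assumes the default port 11000 on the sandbox host and is an assumption, not from the original post):

```shell
# Put the SQL Server JDBC driver into the Sqoop sharelib on HDFS
# (path taken from the oozie.libpath in job.properties above).
hdfs dfs -put sqljdbc42.jar /user/oozie/share/lib/lib_20161025075203/sqoop/

# Ask the Oozie server to reload the sharelib without a restart.
# The -oozie URL here is an assumption based on the sandbox host.
oozie admin -oozie http://sandbox.hortonworks.com:11000/oozie -sharelibupdate
```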

1 ACCEPTED SOLUTION

Contributor

Hi All,
I have solved the issue by doing the following:

  1. Copied sqljdbc42.jar into the Hive and HCatalog lib paths on HDFS.
  2. Appended the job.properties file with values for the oozie.action.sharelib.for.sqoop and oozie.action.sharelib.for.hive properties.

job.properties:

nameNode=hdfs://sandbox.hortonworks.com:8020
jobTracker=sandbox.hortonworks.com:8050
queueName=default
appPath=${nameNode}/<HDFS_path_where_workflow.xml_file>
oozie.use.system.libpath=true
oozie.libpath=${nameNode}/user/oozie/share/lib/lib_20161025075203/
oozie.wf.application.path=${appPath}
#SHARELIB PATH FOR ACTION#
oozie.action.sharelib.for.sqoop=hive,hcatalog,sqoop 
oozie.action.sharelib.for.hive=hive,hcatalog,sqoop

Note:

You could exclude oozie.libpath from job.properties.
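To confirm which jars each action will actually see after the change, the sharelib contents can be listed from the Oozie CLI (a sketch; the `-oozie` URL is an assumption based on the sandbox host above):

```shell
# List the jars Oozie adds to the classpath from the sqoop sharelib,
# so you can verify sqljdbc42.jar is visible to the action.
oozie admin -oozie http://sandbox.hortonworks.com:11000/oozie -shareliblist sqoop
```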

Regards,

Jay.


2 REPLIES

Expert Contributor

Can you check the logs of the mapper task under the Oozie launcher job?

Also, please attach the YARN logs if you can:

yarn logs -applicationId <applicationid>
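The same launcher diagnostics can also be pulled through the Oozie CLI (a sketch; `<workflow-id>` is the ID printed when the job was submitted, and the `-oozie` URL is an assumption based on the sandbox host in job.properties):

```shell
# Show the status of each action in the workflow, including the
# external (launcher) application ID the Sqoop action ran under.
oozie job -oozie http://sandbox.hortonworks.com:11000/oozie -info <workflow-id>

# Stream the workflow's Oozie log for the failing run.
oozie job -oozie http://sandbox.hortonworks.com:11000/oozie -log <workflow-id>
```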
