
Import data into Hive with Sqoop

Hello,
I am trying to run a Sqoop command as an Oozie action, and I get this error:
Intercepting System.exit(1)
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]

This is my workflow XML file:

<workflow-app name="exemple_hive" xmlns="uri:oozie:workflow:0.5">
    <global>
        <configuration>
            <property>
                <name>mapreduce.job.queuename</name>
                <value>DES</value>
            </property>
        </configuration>
    </global>
    <start to="sqoop-9fb3"/>
    <kill name="Kill">
        <message>The action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <action name="sqoop-9fb3">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <command>sqoop import -Dmapred.job.queue.name=DES --connect "jdbc:jtds:sqlserver://xxxx.xxxx.xxxx.xxxx:xxxx;databaseName=xxxxxxxx;user=xxxxxxxx;password=xxxxxxxx;instance=MSPAREBTP02" --driver net.sourceforge.jtds.jdbc.Driver --username hdp-import --table qvol_ccy --hive-import --hive-table test.qvol_ccy -m 1</command>
            <file>/dev/datalake/app/des/dev/lib/jtds-1.3.1.jar#jtds-1.3.1.jar</file>
            <file>/dev/datalake/app/des/dev/script/hive-site.xml#hive-site.xml</file>
        </sqoop>
        <ok to="End"/>
        <error to="Kill"/>
    </action>
    <end name="End"/>
</workflow-app>
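Before digging into runtime errors, the workflow definition itself can be sanity-checked with the Oozie CLI. This is a sketch, not part of the original thread: the Oozie server URL and the local path to workflow.xml are placeholders you would replace with your own values.

```shell
# Validate the workflow XML syntax against the Oozie schema before
# (re)submitting the job. The URL and file path below are placeholders.
oozie validate -oozie http://your-oozie-host:11000/oozie workflow.xml
```

If validation passes, the exit code 1 is coming from the Sqoop command itself rather than from the workflow definition.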

1 ACCEPTED SOLUTION

You need to look into the logs, most likely the YARN logs of the map task of your Oozie launcher. They contain the Sqoop command execution and any errors you would normally see on the command line. You can get them from the ResourceManager UI (click on your Oozie launcher job and drill through to the map task) or with yarn logs -applicationId.

Any issues in the actual data transfer will show up in the MapReduce job that Sqoop kicks off, which is a separate job from the launcher.
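Retrieving those launcher logs from the command line can be sketched as follows. The application ID shown is a placeholder; you would take the real one from the ResourceManager UI or from the listing command. This assumes YARN log aggregation is enabled on the cluster.

```shell
# List recent applications to find the Oozie launcher's application ID
# (the launcher appears as a MAPREDUCE job named after the workflow action).
yarn application -list -appStates FINISHED,FAILED,KILLED

# Pull the aggregated logs for that application; the launcher's map task
# log contains the full Sqoop console output, including the real error.
# The application ID below is a placeholder.
yarn logs -applicationId application_1511111111111_0001 | less
```

Searching that output for "ERROR" or "Exception" usually reveals the actual cause behind the generic "exit code [1]" message.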


