Hi all,

I am running an Oozie workflow that calls a Sqoop import from SQL Server into Hive. It works fine up to loading the data into HDFS, but it fails at the Hive step with the error "[main] ERROR org.apache.sqoop.tool.ImportTool - Encountered IOException running import job: java.io.IOException: Hive exited with status 1". I have tried many recommendations from blogs, but none of them works. Here are the files:

1.) job.properties

nameNode=hdfs://xxx-hdfs.abc.net:8020
jobTracker=xxx-node-01.abc.net:8050
queueName=default
jobRoot=dev/oozie-jobs/
DATEF=2016-01-01
DATET=2016-01-01
oozie.use.system.libpath=true
oozie.libpath=${nameNode}/user/oozie/share/lib/lib_20160202121031
oozie.wf.application.path=${nameNode}/user/${user.name}/${jobRoot}

----------------------------------------------

2.) workflow.xml (sqoop action)

<action name="sqoop2hive">
    <sqoop xmlns="uri:oozie:sqoop-action:0.4">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <configuration>
            <property>
                <name>mapred.job.queue.name</name>
                <value>${queueName}</value>
            </property>
        </configuration>
        <command>import --connect "jdbc:sqlserver://xxx.net;database=C9" --username=myuser --password=xxx --table "test_export" --hive-import --hive-database "dev2" --hive-table "test_export" --warehouse-dir "/hive/warehouse/dev.db/" -m 1</command>
        <!-- The three files below are in the HDFS /tmp folder.
             I have also copied the same files into the workflow project folder on HDFS: the *.jar files into /lib and hive-site.xml at the project root.
             I have also copied the same files into the Oozie share lib folders on HDFS: /user/oozie/share/lib/lib_20160202121031/oozie/, ./hive and ./sqoop. -->
        <file>/tmp/hive-site.xml#hive-site.xml</file>
        <file>/tmp/sqljdbc4.jar#sqljdbc4.jar</file>
        <file>/tmp/libthrift-0.9.2.jar#libthrift-0.9.2.jar</file>
    </sqoop>
    <ok to="end"/>
    <error to="fail"/>
</action>
<kill name="fail">
    <message>Sqoop failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>

----------------------------------------

3.) As a comment:
- The three files below are in the HDFS /tmp folder.
- I have also copied the same files into the workflow project folder on HDFS: the *.jar files into /lib and hive-site.xml at the project root.
- I have also copied the same files into the Oozie share lib folders on HDFS: /user/oozie/share/lib/lib_20160202121031/oozie/, ./hive and ./sqoop.

/tmp/hive-site.xml
/tmp/sqljdbc4.jar
/tmp/libthrift-0.9.2.jar

4.) The only version difference I could find is in the libthrift jar shipped with HBase; for that reason I have included the correct jar version in workflow.xml and the lib folders.

# find ./ -name 'libthrift*.jar' -print
./var/lib/ambari-server/resources/views/work/HIVE{1.0.0}/WEB-INF/lib/libthrift-0.9.0.jar
./tmp/share/lib/pig/libthrift-0.9.2.jar
./tmp/share/lib/hive2/libthrift-0.9.2.jar
./tmp/share/lib/hive/libthrift-0.9.2.jar
***** ./usr/lib/ams-hbase/lib/libthrift-0.9.0.jar *******
./usr/hdp/2.3.4.0-3485/falcon/webapp/falcon/WEB-INF/lib/libthrift-0.9.2.jar
./usr/hdp/2.3.4.0-3485/falcon/client/lib/libthrift-0.9.2.jar
**** ./usr/hdp/2.3.4.0-3485/hbase/lib/libthrift-0.9.0.jar *****
./usr/hdp/2.3.4.0-3485/oozie/share/lib/pig/libthrift-0.9.2.jar
./usr/hdp/2.3.4.0-3485/oozie/share/lib/hive2/libthrift-0.9.2.jar
./usr/hdp/2.3.4.0-3485/oozie/share/lib/hive/libthrift-0.9.2.jar
./usr/hdp/2.3.4.0-3485/oozie/libserver/libthrift-0.9.2.jar
./usr/hdp/2.3.4.0-3485/oozie/oozie-server/webapps/oozie/WEB-INF/lib/libthrift-0.9.2.jar
./usr/hdp/2.3.4.0-3485/oozie/libtools/libthrift-0.9.2.jar
./usr/hdp/2.3.4.0-3485/hive/lib/libthrift-0.9.2.jar

--------------------------------------------------------------------------------
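For completeness, this is roughly how I am sanity-checking that the files referenced by the <file> elements and the Oozie share lib are actually visible to Oozie. It is only a sketch: the Oozie server URL below is a placeholder for my environment, and the -shareliblist/-sharelibupdate commands assume the Oozie 4.1+ CLI that ships with HDP 2.3.

# 1. The files referenced by the <file> elements must exist in HDFS:
hdfs dfs -ls /tmp/hive-site.xml /tmp/sqljdbc4.jar /tmp/libthrift-0.9.2.jar

# 2. The jars copied under the share lib should show up for the sqoop action
#    (the -oozie URL is a placeholder for my Oozie server):
oozie admin -oozie http://xxx-node-01.abc.net:11000/oozie -shareliblist sqoop

# 3. After adding files under /user/oozie/share/lib, refresh the share lib so
#    Oozie picks up the new contents:
oozie admin -oozie http://xxx-node-01.abc.net:11000/oozie -sharelibupdate

# 4. The workflow definition validates against the schema (run on a local copy):
oozie validate workflow.xml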
5.) Here is the main part of the log file:

yarn.resourcemanager.webapp.https.address=xxx-hdp-01.abc.net:8090
------------------------
Sqoop command arguments :
import --connect jdbc:sqlserver://xxx-00.abc.net;database=C9 --username=myuser --password=***** --table test_export --hive-import --hive-database development2 --hive-table test_export --warehouse-dir /hive/warehouse/dev2.db/ -m 1
Fetching child yarn jobs
tag id : oozie-4d35cf7fa1bc973227771c93b6464c6e
2016-11-09 15:43:50,925 INFO [main] client.RMProxy (RMProxy.java:createRMProxy(98)) - Connecting to ResourceManager at xxx-hdp-01.abc.net/xx.x.xx.xx:8050
Child yarn jobs are found -
2016-11-09 15:44:20,859 INFO [Thread-28] hive.HiveImport (LoggingAsyncSink.java:run(85)) - For more detailed output, check application tracking page: http://xxx-hdp-01.abc.net:8088/cluster/app/application_1474464025693_0554 Then, click on links to logs of each attempt.
2016-11-09 15:44:20,859 INFO [Thread-28] hive.HiveImport (LoggingAsyncSink.java:run(85)) - Diagnostics: File does not exist: hdfs://xxx-hdfs.abc.net:8020/user/yarn/.hiveJars/hive-exec-1.2.1.2.3.4.0-3485-bb59749376792da886f093283cc8bbdb78c69612f13abcbcedbef00717030c90.jar
2016-11-09 15:44:20,859 INFO [Thread-28] hive.HiveImport (LoggingAsyncSink.java:run(85)) - java.io.FileNotFoundException: File does not exist: hdfs://xxx-hdfs.abc.net:8020/user/yarn/.hiveJars/hive-exec-1.2.1.2.3.4.0-3485-bb59749376792da886f093283cc8bbdb78c69612f13abcbcedbef00717030c90.jar
31705 [Thread-28] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1319)

>>> Invoking Sqoop command line now >>>

2016-11-09 15:43:51,093 WARN [main] tool.SqoopTool (SqoopTool.java:loadPluginsFromConfDir(177)) - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
1960 [main] INFO org.apache.sqoop.Sqoop - Running Sqoop version: 1.4.6.2.3.4.0-3485
2016-11-09 15:43:56,978 INFO [main] mapreduce.Job (Job.java:submit(1294)) - The url to track the job: http://xxx-hdp-01.abc.net:8088/proxy/application_1474464025693_0553/
2016-11-09 15:43:56,979 INFO [main] mapreduce.Job (Job.java:monitorAndPrintJob(1339)) - Running job: job_1474464025693_0553
2016-11-09 15:44:04,065 INFO [main] mapreduce.Job (Job.java:monitorAndPrintJob(1360)) - Job job_1474464025693_0553 running in uber mode : false
2016-11-09 15:44:04,066 INFO [main] mapreduce.Job (Job.java:monitorAndPrintJob(1367)) - map 0% reduce 0%
2016-11-09 15:44:12,448 INFO [main] mapreduce.Job (Job.java:monitorAndPrintJob(1367)) - map 100% reduce 0%
2016-11-09 15:44:12,455 INFO [main] mapreduce.Job (Job.java:monitorAndPrintJob(1378)) - Job job_1474464025693_0553 completed successfully
2016-11-09 15:44:12,508 INFO [main] mapreduce.Job (Job.java:monitorAndPrintJob(1385)) - Counters: 30
    File System Counters
        FILE: Number of bytes read=0
        FILE: Number of bytes written=310534
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=87
        HDFS: Number of bytes written=42
        HDFS: Number of read operations=4
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=2
    Job Counters
        Launched map tasks=1
        Other local map tasks=1
        Total time spent by all maps in occupied slots (ms)=3999
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=3999
        Total vcore-seconds taken by all map tasks=3999
        Total megabyte-seconds taken by all map tasks=10237440
    Map-Reduce Framework
        Map input records=3
        Map output records=3
        Input split bytes=87
        Spilled Records=0
        Failed Shuffles=0
        Merged Map outputs=0
        GC time elapsed (ms)=47
        CPU time spent (ms)=1750
        Physical memory (bytes) snapshot=217124864
        Virtual memory (bytes) snapshot=3938050048
        Total committed heap usage (bytes)=225443840
    File Input Format Counters
        Bytes Read=0
    File Output Format Counters
        Bytes Written=42
2016-11-09 15:44:12,513 INFO [main] mapreduce.ImportJobBase (ImportJobBase.java:runJob(184)) - Transferred 42 bytes in 19.2702 seconds (2.1795 bytes/sec)
2016-11-09 15:44:12,517 INFO [main] mapreduce.ImportJobBase (ImportJobBase.java:runJob(186)) - Retrieved 3 records.
2016-11-09 15:44:12,531 INFO [main] manager.SqlManager (SqlManager.java:execute(757)) - Executing SQL statement: SELECT t.* FROM [test_export] AS t WHERE 1=0
2016-11-09 15:44:12,547 INFO [main] hive.HiveImport (HiveImport.java:importTable(195)) - Loading uploaded data into Hive
25152 [Thread-28] INFO org.apache.sqoop.hive.HiveImport - WARNING: Use "yarn jar" to launch YARN applications.
2016-11-09 15:44:15,301 INFO [Thread-28] hive.HiveImport (LoggingAsyncSink.java:run(85)) -
2016-11-09 15:44:15,302 INFO [Thread-28] hive.HiveImport (LoggingAsyncSink.java:run(85)) - Logging initialized using configuration in file:/etc/hive/2.3.4.0-3485/0/hive-log4j.properties
Heart beat
2016-11-09 15:44:20,857 INFO [Thread-28] hive.HiveImport (LoggingAsyncSink.java:run(85)) - Exception in thread "main" java.lang.RuntimeException: org.apache.tez.dag.api.SessionNotRunning: TezSession has already shutdown. Application application_1474464025693_0554 failed 2 times due to AM Container for appattempt_1474464025693_0554_000002 exited with exitCode: -1000
2016-11-09 15:44:20,857 INFO [Thread-28] hive.HiveImport (LoggingAsyncSink.java:run(85)) - Diagnostics: File does not exist: hdfs://xxx-hdfs.abc.net:8020/user/yarn/.hiveJars/hive-exec-1.2.1.2.3.4.0-3485-bb59749376792da886f093283cc8bbdb78c69612f13abcbcedbef00717030c90.jar

******** I have checked that the above file exists ********

2016-11-09 15:44:20,857 INFO [Thread-28] hive.HiveImport (LoggingAsyncSink.java:run(85)) - java.io.FileNotFoundException: File does not exist: hdfs://xxx-hdfs.abc.net:8020/user/yarn/.hiveJars/hive-exec-1.2.1.2.3.4.0-3485-bb59749376792da886f093283cc8bbdb78c69612f13abcbcedbef00717030c90.jar
31703 [Thread-28] INFO org.apache.sqoop.hive.HiveImport - at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1319)
..
..
.. bunch of messages ..
2016-11-09 15:44:20,860 INFO [Thread-28] hive.HiveImport (LoggingAsyncSink.java:run(85)) - at javax.security.auth.Subject.doAs(Subject.java:422)
2016-11-09 15:44:20,860 INFO [Thread-28] hive.HiveImport (LoggingAsyncSink.java:run(85)) - at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
2016-11-09 15:44:20,860 INFO [Thread-28] hive.HiveImport (LoggingAsyncSink.java:run(85)) - at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:358)
2016-11-09 15:44:20,860 INFO [Thread-28] hive.HiveImport (LoggingAsyncSink.java:run(85)) - at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:62)
2016-11-09 15:44:20,860 INFO [Thread-28] hive.HiveImport (LoggingAsyncSink.java:run(85)) - at java.util.concurrent.FutureTask.run(FutureTask.java:266)
2016-11-09 15:44:20,860 INFO [Thread-28] hive.HiveImport (LoggingAsyncSink.java:run(85)) - at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
2016-11-09 15:44:20,860 INFO [Thread-28] hive.HiveImport (LoggingAsyncSink.java:run(85)) - at java.util.concurrent.FutureTask.run(FutureTask.java:266)
2016-11-09 15:44:20,860 INFO [Thread-28] hive.HiveImport (LoggingAsyncSink.java:run(85)) - at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
2016-11-09 15:44:20,860 INFO [Thread-28] hive.HiveImport (LoggingAsyncSink.java:run(85)) - at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2016-11-09 15:44:20,860 INFO [Thread-28] hive.HiveImport (LoggingAsyncSink.java:run(85)) - at java.lang.Thread.run(Thread.java:745)
2016-11-09 15:44:20,860 INFO [Thread-28] hive.HiveImport (LoggingAsyncSink.java:run(85)) -
2016-11-09 15:44:20,860 INFO [Thread-28] hive.HiveImport (LoggingAsyncSink.java:run(85)) - Failing this attempt. Failing the application.
2016-11-09 15:44:20,860 INFO [Thread-28] hive.HiveImport (LoggingAsyncSink.java:run(85)) - at org.apache.tez.client.TezClient.waitTillReady(TezClient.java:726)
2016-11-09 15:44:20,860 INFO [Thread-28] hive.HiveImport (LoggingAsyncSink.java:run(85)) - at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.open(TezSessionState.java:217)
2016-11-09 15:44:20,860 INFO [Thread-28] hive.HiveImport (LoggingAsyncSink.java:run(85)) - at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.open(TezSessionState.java:117)
2016-11-09 15:44:20,860 INFO [Thread-28] hive.HiveImport (LoggingAsyncSink.java:run(85)) - at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:504)
2016-11-09 15:44:20,860 INFO [Thread-28] hive.HiveImport (LoggingAsyncSink.java:run(85)) - ... 8 more
2016-11-09 15:44:21,242 ERROR [main] tool.ImportTool (ImportTool.java:run(613)) - Encountered IOException running import job: java.io.IOException: Hive exited with status 1
    at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:394)
    at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:344)
    at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:245)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:514)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
    at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:197)
    at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:177)
    at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
    at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:46)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:241)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)

Intercepting System.exit(1)

<<< Invocation of Main class completed <<<

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]

Oozie Launcher failed, finishing Hadoop job gracefully

Oozie Launcher, uploading action data to HDFS sequence file: hdfs://xxxx-hdfs.abc.net:8020/user/imtiaz.yousaf/oozie-oozi/0000061-160921102216774-oozie-oozi-W/sqoop2hive--sqoop/action-data.seq
2016-11-09 15:44:21,279 INFO [main] zlib.ZlibFactory (ZlibFactory.java:(49)) - Successfully loaded & initialized native-zlib library
2016-11-09 15:44:21,280 INFO [main] compress.CodecPool (CodecPool.java:getCompressor(153)) - Got brand-new compressor [.deflate]

Oozie Launcher ends

2016-11-09 15:44:21,333 INFO [main] mapred.Task (Task.java:done(1038)) - Task:attempt_1474464025693_0552_m_000000_0 is done. And is in the process of committing
2016-11-09 15:44:21,365 INFO [main] mapred.Task (Task.java:commit(1199)) - Task attempt_1474464025693_0552_m_000000_0 is allowed to commit now
2016-11-09 15:44:21,397 INFO [main] output.FileOutputCommitter (FileOutputCommitter.java:commitTask(582)) - Saved output of task 'attempt_1474464025693_0552_m_000000_0' to hdfs://xxx-hdfs.abc.net:8020/user/imtiaz.yousaf/oozie-oozi/0000061-160921102216774-oozie-oozi-W/sqoop2hive--sqoop/output/_temporary/1/task_1474464025693_0552_m_000000
2016-11-09 15:44:21,421 INFO [main] mapred.Task (Task.java:sendDone(1158)) - Task 'attempt_1474464025693_0552_m_000000_0' done.
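Since the Tez AM claims the hive-exec jar under /user/yarn/.hiveJars does not exist even though I believe it does, this is the rough sketch of checks I am running from an edge node to narrow things down. It assumes shell access to a cluster node, permission to run commands as the yarn user via sudo, and it simply re-runs the same import outside Oozie to see whether the Hive step fails there too (host names and the password placeholder are the same obfuscated values as above).

# 1. Confirm the exact jar Tez complains about is really at that HDFS path,
#    and check its owner/permissions (the container localizes it as the yarn user):
hdfs dfs -ls hdfs://xxx-hdfs.abc.net:8020/user/yarn/.hiveJars/
sudo -u yarn hdfs dfs -ls /user/yarn/.hiveJars/

# 2. Pull the full diagnostics of the failed Tez application attempt:
yarn logs -applicationId application_1474464025693_0554

# 3. Check that the hive-site.xml shipped with the action matches the cluster
#    (execution engine and metastore settings in particular):
hdfs dfs -cat /tmp/hive-site.xml | grep -A1 -E 'hive.execution.engine|hive.metastore.uris'

# 4. Run the same import from the command line, outside Oozie, to isolate the problem:
sqoop import \
  --connect "jdbc:sqlserver://xxx-00.abc.net;database=C9" \
  --username myuser --password 'xxx' \
  --table test_export \
  --hive-import --hive-database development2 --hive-table test_export \
  --warehouse-dir /hive/warehouse/dev2.db/ -m 1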