Created 07-11-2018 10:59 AM
Hi All,
I want to export CSV data into MS SQL Server using Sqoop. I have created a table in MS SQL Server which has an auto-increment column named 'ID'. I have a CSV file in an HDFS directory. I executed the Sqoop export command below:
Sqoop Command:
sqoop export --connect 'jdbc:sqlserver://xxx.xxx.xx.xx:xxxx;databaseName=<mssql_database_name>' --username xxxx --password xxxx --export-dir /user/root/input/data.csv --table <mssql_table_name>
I am facing the following error:
Error:
18/07/11 10:30:48 INFO mapreduce.Job:  map 0% reduce 0%
18/07/11 10:31:12 INFO mapreduce.Job:  map 75% reduce 0%
18/07/11 10:31:13 INFO mapreduce.Job:  map 100% reduce 0%
18/07/11 10:31:17 INFO mapreduce.Job: Job job_1531283775339_0005 failed with state FAILED due to: Task failed task_1531283775339_0005_m_000003
Job failed as tasks failed. failedMaps:1 failedReduces:0
18/07/11 10:31:18 INFO mapreduce.Job: Counters: 31
        File System Counters
                FILE: Number of bytes read=0
                FILE: Number of bytes written=163061
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=261
                HDFS: Number of bytes written=0
                HDFS: Number of read operations=7
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=0
        Job Counters
                Failed map tasks=3
                Launched map tasks=4
                Data-local map tasks=4
                Total time spent by all maps in occupied slots (ms)=87423
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=87423
                Total vcore-milliseconds taken by all map tasks=87423
                Total megabyte-milliseconds taken by all map tasks=21855750
        Map-Reduce Framework
                Map input records=0
                Map output records=0
                Input split bytes=240
                Spilled Records=0
                Failed Shuffles=0
                Merged Map outputs=0
                GC time elapsed (ms)=346
                CPU time spent (ms)=470
                Physical memory (bytes) snapshot=106921984
                Virtual memory (bytes) snapshot=1924980736
                Total committed heap usage (bytes)=39321600
        File Input Format Counters
                Bytes Read=0
        File Output Format Counters
                Bytes Written=0
18/07/11 10:31:18 INFO mapreduce.ExportJobBase: Transferred 261 bytes in 71.3537 seconds (3.6578 bytes/sec)
18/07/11 10:31:18 INFO mapreduce.ExportJobBase: Exported 0 records.
18/07/11 10:31:18 ERROR mapreduce.ExportJobBase: Export job failed!
18/07/11 10:31:18 ERROR tool.ExportTool: Error during export: Export job failed!
Sample Data:
abc,1223
abck,1332
abckp,2113
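(Not the poster's text, but a debugging sketch that fits here.) The job-level output above only says the export failed; the actual SQLException is in the failed map task's log. One way to see it, using the application ID derived from the job ID in the log, and then a possible fix if the cause turns out to be the identity column ("name" and "value" are hypothetical column names; adjust to the real table):

```shell
# Pull the failed mapper's log; the root-cause exception usually appears
# under a "Caused by" line. The application ID is the job ID with the
# "job_" prefix replaced by "application_".
yarn logs -applicationId application_1531283775339_0005 | grep -B 2 -A 20 "Caused by"

# If the log shows SQL Server rejecting inserts into the identity column
# (e.g. "Cannot insert explicit value for identity column ..."), list only
# the non-identity columns so Sqoop leaves 'ID' to auto-increment:
sqoop export \
  --connect 'jdbc:sqlserver://xxx.xxx.xx.xx:xxxx;databaseName=<mssql_database_name>' \
  --username xxxx --password xxxx \
  --table <mssql_table_name> \
  --columns "name,value" \
  --export-dir /user/root/input/data.csv
```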
Regards,
Jay.
Created 07-11-2018 09:15 PM
Hello @JAy PaTel! 
I guess you have to change your --export-dir from /user/root/input/data.csv to /user/root/input/.
Hope this helps!
Created 07-12-2018 06:22 AM
Thanks for the reply,
But as far as I know, that won't work.
Since I have many input files in the same folder, how would Sqoop know which file the user wants to export?
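(A workaround sketch, not from the thread.) If pointing --export-dir at the whole folder would pick up unwanted files, one option is to stage the single CSV in its own HDFS directory and export that directory instead (the directory name "export_tmp" is hypothetical; paths are the ones from the thread):

```shell
# Stage data.csv alone in a new HDFS directory so only it gets exported.
hdfs dfs -mkdir -p /user/root/input/export_tmp
hdfs dfs -cp /user/root/input/data.csv /user/root/input/export_tmp/
# Then run the same sqoop export with:
#   --export-dir /user/root/input/export_tmp
```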
Regards,
Jay.