
Sqoop export failed to export data into MsSQL


Hi All,

I want to export CSV data into MS SQL Server using Sqoop. I have created a table in SQL Server that has one auto-increment column named 'ID'. I have one CSV file in an HDFS directory. I executed the Sqoop export command below:

Sqoop Command:

sqoop export --connect 'jdbc:sqlserver://;databasename=<mssql_database_name>' --username xxxx --password xxxx --export-dir /user/root/input/data.csv --table <mssql_table_name>
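As a side note, the connect string above has nothing between `jdbc:sqlserver://` and the semicolon: a SQL Server JDBC URL normally names the host (and optionally port) there. A more readable, hedged version of the same command, with hypothetical placeholders (`<host>`, `<port>`, `<user>`, `<password>`) that must be replaced with real values:

```shell
# Sketch only: <host>, <port>, <user>, <password> and the bracketed names
# are placeholders, not values from the original post.
sqoop export \
  --connect 'jdbc:sqlserver://<host>:<port>;databasename=<mssql_database_name>' \
  --username <user> \
  --password <password> \
  --export-dir /user/root/input/data.csv \
  --table <mssql_table_name>
```

If the host is genuinely missing in the real command, the map tasks would fail to open a JDBC connection, which is consistent with all map tasks failing before writing any records.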

I am facing the following error:


18/07/11 10:30:48 INFO mapreduce.Job:  map 0% reduce 0%
18/07/11 10:31:12 INFO mapreduce.Job:  map 75% reduce 0%
18/07/11 10:31:13 INFO mapreduce.Job:  map 100% reduce 0%
18/07/11 10:31:17 INFO mapreduce.Job: Job job_1531283775339_0005 failed with state FAILED due to: Task failed task_1531283775339_0005_m_000003
Job failed as tasks failed. failedMaps:1 failedReduces:0
18/07/11 10:31:18 INFO mapreduce.Job: Counters: 31
        File System Counters
                FILE: Number of bytes read=0
                FILE: Number of bytes written=163061
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=261
                HDFS: Number of bytes written=0
                HDFS: Number of read operations=7
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=0
        Job Counters
                Failed map tasks=3
                Launched map tasks=4
                Data-local map tasks=4
                Total time spent by all maps in occupied slots (ms)=87423
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=87423
                Total vcore-milliseconds taken by all map tasks=87423
                Total megabyte-milliseconds taken by all map tasks=21855750
        Map-Reduce Framework
                Map input records=0
                Map output records=0
                Input split bytes=240
                Spilled Records=0
                Failed Shuffles=0
                Merged Map outputs=0
                GC time elapsed (ms)=346
                CPU time spent (ms)=470
                Physical memory (bytes) snapshot=106921984
                Virtual memory (bytes) snapshot=1924980736
                Total committed heap usage (bytes)=39321600
        File Input Format Counters
                Bytes Read=0
        File Output Format Counters
                Bytes Written=0
18/07/11 10:31:18 INFO mapreduce.ExportJobBase: Transferred 261 bytes in 71.3537 seconds (3.6578 bytes/sec)
18/07/11 10:31:18 INFO mapreduce.ExportJobBase: Exported 0 records.
18/07/11 10:31:18 ERROR mapreduce.ExportJobBase: Export job failed!
18/07/11 10:31:18 ERROR tool.ExportTool: Error during export: Export job failed!
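The driver log above only says that map tasks failed; the actual exception (e.g. a JDBC error or a parse/column mismatch) is in the task logs. One way to pull them, assuming YARN log aggregation is enabled on the cluster; the application id below mirrors the job id `job_1531283775339_0005` from the log:

```shell
# Fetch aggregated task logs for the failed export job and show any
# exception stack traces. Requires YARN log aggregation to be enabled.
yarn logs -applicationId application_1531283775339_0005 | grep -A 20 'Exception'
```

The stack trace found there usually names the real cause directly.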

Sample Data: (attached as a screenshot in the original post; not reproduced here)

Hello @JAy PaTel!
I guess you have to change your --export-dir /user/root/input/data.csv to --export-dir /user/root/input/

Hope this helps!


@Vinicius Higa Murakami

Thanks for the reply,

But as far as I know, that won't work.

Since I have too many input files in the same folder, how would Sqoop identify which file the user wants to export?
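For what it's worth, `--export-dir` can point either to a directory or to a single file, so keeping `data.csv` is fine. A more likely cause of the failure, given that the target table has an auto-increment column `ID` that is not in the CSV, is a column mismatch; a hedged fix is to tell Sqoop exactly which table columns the CSV maps to via `--columns`, leaving `ID` out so SQL Server fills it in (column names below are hypothetical):

```shell
# <col1>,<col2>,<col3> are hypothetical names of the non-identity columns
# that actually appear in data.csv, listed in the CSV's column order.
sqoop export \
  --connect 'jdbc:sqlserver://<host>;databasename=<mssql_database_name>' \
  --username <user> --password <password> \
  --table <mssql_table_name> \
  --columns '<col1>,<col2>,<col3>' \
  --export-dir /user/root/input/data.csv
```

Checking the failed task's log (via `yarn logs`) would confirm whether the error is a column/parse mismatch or a connection problem.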

