ERROR tool.ExportTool: Error during export: Export job failed!

New Contributor

Hi all,

I am trying to export data from HBase to MySQL using the sqoop export command below. When I run it, the job fails with

ERROR tool.ExportTool: Error during export:
Export job failed!

even though the input data is in the correct format.

Can anyone tell me what I need to do to get the Sqoop export to run?

Below are the console logs:

hdfs@node-2:/$ sqoop export --connect jdbc:mysql://kraptor/kraptor  --username root --password-file file:///var/lib/hadoop-hdfs/sqoop.password  --table Demo_blog --update-key id  --update-mode updateonly --export-dir /user/hdfs/demoblog.csv -m4  --lines-terminated-by '\n' --input-fields-terminated-by ',' --driver com.mysql.jdbc.Driver;
Warning: /opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
17/07/25 09:28:33 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.12.0
17/07/25 09:28:34 WARN sqoop.ConnFactory: Parameter --driver is set to an explicit driver however appropriate connection manager is not being set (via --connection-manager). Sqoop is going to fall back to org.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which connection manager should be used next time.
17/07/25 09:28:34 INFO manager.SqlManager: Using default fetchSize of 1000
17/07/25 09:28:34 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM Demo_blog AS t WHERE 1=0
17/07/25 09:28:34 INFO tool.CodeGenTool: Beginning code generation
17/07/25 09:28:34 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM Demo_blog AS t WHERE 1=0
17/07/25 09:28:34 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM Demo_blog AS t WHERE 1=0
17/07/25 09:28:34 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/bin/../lib/sqoop/../hadoop-mapreduce
Note: /tmp/sqoop-hdfs/compile/953360bccb8d21472993a7ad36ca8dac/Demo_blog.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/07/25 09:28:35 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hdfs/compile/953360bccb8d21472993a7ad36ca8dac/Demo_blog.jar
17/07/25 09:28:35 INFO mapreduce.ExportJobBase: Beginning export of Demo_blog
17/07/25 09:28:36 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
17/07/25 09:28:36 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM Demo_blog AS t WHERE 1=0
17/07/25 09:28:36 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
17/07/25 09:28:36 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
17/07/25 09:28:36 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
17/07/25 09:28:36 INFO client.RMProxy: Connecting to ResourceManager at node-2.c.k-raptor.internal/10.140.0.3:8032
17/07/25 09:28:38 INFO input.FileInputFormat: Total input paths to process : 1
17/07/25 09:28:38 INFO input.FileInputFormat: Total input paths to process : 1
17/07/25 09:28:38 INFO mapreduce.JobSubmitter: number of splits:4
17/07/25 09:28:38 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1500463014055_0245
17/07/25 09:28:39 INFO impl.YarnClientImpl: Submitted application application_1500463014055_0245
17/07/25 09:28:39 INFO mapreduce.Job: The url to track the job: http://node-2.c.k-raptor.internal:8088/proxy/application_1500463014055_0245/
17/07/25 09:28:39 INFO mapreduce.Job: Running job: job_1500463014055_0245
17/07/25 09:28:45 INFO mapreduce.Job: Job job_1500463014055_0245 running in uber mode : false
17/07/25 09:28:45 INFO mapreduce.Job:  map 0% reduce 0%
17/07/25 09:28:50 INFO mapreduce.Job:  map 75% reduce 0%
17/07/25 09:28:51 INFO mapreduce.Job:  map 100% reduce 0%
17/07/25 09:28:51 INFO mapreduce.Job: Job job_1500463014055_0245 failed with state FAILED due to: Task failed task_1500463014055_0245_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0

17/07/25 09:28:51 INFO mapreduce.Job: Counters: 32
        File System Counters
                FILE: Number of bytes read=0
                FILE: Number of bytes written=460611
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=723
                HDFS: Number of bytes written=0
                HDFS: Number of read operations=12
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=0
        Job Counters 
                Failed map tasks=1
                Launched map tasks=4
                Data-local map tasks=1
                Rack-local map tasks=3
                Total time spent by all maps in occupied slots (ms)=12088
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=12088
                Total vcore-milliseconds taken by all map tasks=12088
                Total megabyte-milliseconds taken by all map tasks=12378112
        Map-Reduce Framework
                Map input records=0
                Map output records=0
                Input split bytes=426
                Spilled Records=0
                Failed Shuffles=0
                Merged Map outputs=0
                GC time elapsed (ms)=59
                CPU time spent (ms)=1940
                Physical memory (bytes) snapshot=1031741440
                Virtual memory (bytes) snapshot=4260794368
                Total committed heap usage (bytes)=2472542208
        File Input Format Counters 
                Bytes Read=0
        File Output Format Counters 
                Bytes Written=0
17/07/25 09:28:51 INFO mapreduce.ExportJobBase: Transferred 723 bytes in 14.8595 seconds (48.6558 bytes/sec)
17/07/25 09:28:51 INFO mapreduce.ExportJobBase: Exported 0 records.
17/07/25 09:28:51 ERROR tool.ExportTool: Error during export: 
Export job failed!
        at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:439)
        at org.apache.sqoop.manager.SqlManager.updateTable(SqlManager.java:965)
        at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:70)
        at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:99)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:252)



Thank You
2 Replies

Mentor
Have you tried looking at the logs of the failed task printed in the job output, task_1500463014055_0245_m_000000? They usually contain the underlying exception; a sketch of how to pull them is below.
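A minimal sketch of pulling those logs with the YARN CLI, assuming log aggregation is enabled on the cluster; the application ID comes from the console output above, and the output file name and grep pattern are only illustrative:

# Fetch the aggregated container logs for the failed application
yarn logs -applicationId application_1500463014055_0245 > export_job.log

# Find the failed map attempt and the exception that follows it
grep -A 20 "m_000000" export_job.log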

New Contributor

Check your input file: if its fields are separated by a character other than ',', pass that character to --input-fields-terminated-by <char> and the export should work. A sketch is below.
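For illustration only, here is how that might look, assuming the file turns out to be tab-separated (the '\t' delimiter is an assumption; inspect the file first and substitute whatever it actually uses):

# Inspect the first few lines of the export file to see the real field delimiter
hdfs dfs -cat /user/hdfs/demoblog.csv | head -n 5

# Same export command as above, with only the field delimiter changed
# (tab is shown purely as an example)
sqoop export \
  --connect jdbc:mysql://kraptor/kraptor \
  --username root \
  --password-file file:///var/lib/hadoop-hdfs/sqoop.password \
  --table Demo_blog \
  --update-key id \
  --update-mode updateonly \
  --export-dir /user/hdfs/demoblog.csv \
  -m 4 \
  --lines-terminated-by '\n' \
  --input-fields-terminated-by '\t' \
  --driver com.mysql.jdbc.Driver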

 

Let me know in case you still have the issue.

Thanks,

Shashi