Sqoop export job fails after the map task starts


Hello,

I am trying to export a .csv file stored in HDFS to a table in MySQL. When I execute the sqoop export command, the map task starts to run, but after some time it fails, so the job fails as well. Following is my command:

sqoop export --connect jdbc:mysql://xx.xx.xx.xx/exam --username horton --password horton --table tbl3 --export-dir /data/sqoop/export --input-fields-terminated-by ',' --input-lines-terminated-by '\n'

The file name.csv is stored in /data/sqoop/export in HDFS. Following is the file that I want to export:

name.csv

1,xyz 
2,pqr 
3,abc

The attached image shows the stack trace of the sqoop command execution.

[Attachment: 79441-export.jpg]


Hi @heta desai!
Could you add the --verbose flag to your sqoop command?

sqoop export --verbose --connect jdbc:mysql://xx.xx.xx.xx/exam --username horton --password horton --table tbl3 --export-dir /data/sqoop/export --input-fields-terminated-by ',' --input-lines-terminated-by '\n'
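It's also worth double-checking that the number and types of columns in tbl3 line up with the file. Sqoop export splits each line on the field terminator and maps the values to the table's columns by position, so a mismatch usually makes the mappers fail with parse or insert errors. A quick way to look at the table layout (adjust the host and credentials to however you normally connect; the values below just mirror your sqoop command):

# Show the column layout of the target table in the exam database.
mysql -h xx.xx.xx.xx -u horton -p exam -e "DESCRIBE tbl3;"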
Also, let's check whether there are any special characters in your file:
hdfs dfs -get /data/sqoop/export/<YOUR FILE>
cat -A <YOUR FILE>
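For reference, cat -A marks the end of every line with '$' and prints control characters visibly (tabs as '^I', carriage returns as '^M'). Assuming the sample data above, a clean Unix-format file would print roughly like this:

1,xyz$
2,pqr$
3,abc$

If you instead see something like '1,xyz^M$', the file has Windows-style line endings, which won't match the '\n' line terminator in your export command.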

Hope this helps!
