06-23-2014 04:27 AM
I have a seemingly simple use case for Sqoop: incrementally import data from a MySQL db into HDFS. At first I tried Sqoop2, but it seems Sqoop2 doesn't support incremental imports yet. Am I right about that? (Sqoop2 handled plain imports fine, btw)
Then I tried to use Sqoop (1) and figured out I need to create a job so Sqoop can automatically update stuff like the last value for me. This is the command I used to create a job:
sqoop job --create import-test -- import --connect jdbc:mysql://10.211.55.1/test_sqoop --username root -P --table test_incr_update --target-dir /user/vagrant/sqooptest --check-column updated --incremental lastmodified
When I run it for the first time, it works great. When I run it for the second time, I get:
ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://vm-cluster-node1:8020/user/vagrant/sqooptest already exists
I could of course remove the target dir before running the import again, but that would defeat the whole purpose of fetching only the newer data and merging it with the old data (which requires the old data to still be present, I assume)!
Is this a (known) bug? Or am I doing something wrong? Any help would be appreciated :)
06-23-2014 11:20 AM
06-24-2014 12:39 AM
Sqoop2 does not support incremental imports just yet (https://issues.apache.org/jira/browse/SQOOP-1168).
It looks like the command you're running only creates a saved job. Have you tried executing the saved job (http://sqoop.apache.org/docs/1.4.4/SqoopUserGuide.html#_saved_jobs)? That should be achievable via: sqoop job --exec import-test.
You don't need to create a job to perform incremental imports (http://sqoop.apache.org/docs/1.4.4/SqoopUserGuide.html#_incremental_imports).
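For reference, a one-off incremental import (no saved job) would look roughly like this. The connection string, table, and column names are copied from the original command; the --last-value timestamp is a placeholder you would have to track and supply yourself on each run, which is exactly the bookkeeping a saved job is meant to do for you:

```shell
# One-off lastmodified import: no saved job, so the caller must
# remember the high-water mark and pass it via --last-value.
sqoop import \
  --connect jdbc:mysql://10.211.55.1/test_sqoop \
  --username root -P \
  --table test_incr_update \
  --target-dir /user/vagrant/sqooptest \
  --check-column updated \
  --incremental lastmodified \
  --last-value "2014-06-23 00:00:00"
```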
sqoop job --exec import-test is actually how I ran the saved job. As I said, it runs fine the first time; the second time it complains that the output dir already exists.
The reason I used a saved job for this is the promise that it will keep track of the last value and fill it in automatically.
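As a sanity check on that bookkeeping, the saved job's stored metadata (including the recorded last value) can be inspected with the job tool; the exact output format varies by Sqoop version:

```shell
# List saved jobs, then dump the metadata for the job created above,
# which includes the stored incremental last value after a successful run.
sqoop job --list
sqoop job --show import-test
```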
06-25-2014 12:37 AM - edited 06-25-2014 12:38 AM
I'm using CDH 5.0.2 together with Cloudera Manager 5.0.2. I think the SQOOP issue you linked is exactly the problem I'm having. I shouldn't have to add --append, because I'm already using lastmodified, which is the other incremental mode.
As long as SQOOP-1138 isn't fixed, Sqoop will be rather useless to me :) The only alternative seems to be exporting the whole database each time and replacing the old data with the new export.
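For anyone landing here later: newer Sqoop 1.4.x releases let a lastmodified import write into an existing target dir if you supply --merge-key, the column used to reconcile old and updated rows. Whether this is available depends on the Sqoop version your CDH ships. A sketch, assuming the table's primary key column is named id (an assumption, not from the original posts):

```shell
# lastmodified import that merges new/updated rows into the existing
# target dir, keyed on the primary key (assumed here to be "id").
sqoop import \
  --connect jdbc:mysql://10.211.55.1/test_sqoop \
  --username root -P \
  --table test_incr_update \
  --target-dir /user/vagrant/sqooptest \
  --check-column updated \
  --incremental lastmodified \
  --merge-key id \
  --last-value "2014-06-23 00:00:00"
```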
01-30-2019 10:41 PM - edited 01-30-2019 10:48 PM
You can use "--delete-target-dir" while running the sqoop command:
sqoop import --connect jdbc:mysql://220.127.116.11:1234/retailfgh --username 12345 --password 12345 --table departments --delete-target-dir --target-dir /poc/sqoop_destination --fields-terminated-by "~"