
Sqoop incremental: Output directory already exists

Explorer

Hi all!

 

I have a seemingly simple use case for Sqoop: incrementally import data from a MySQL db into HDFS. At first I tried Sqoop2, but it seems Sqoop2 doesn't support incremental imports yet. Am I correct about that? (Plain imports worked fine in Sqoop2, by the way.)

Then I tried Sqoop (1) and figured out that I need to create a job so Sqoop can automatically keep track of things like the last value for me. This is the command I used to create the job:

 

sqoop job --create import-test -- import --connect jdbc:mysql://10.211.55.1/test_sqoop --username root -P --table test_incr_update --target-dir /user/vagrant/sqooptest --check-column updated --incremental lastmodified

 

When I run it for the first time, it works great. When I run it for the second time, I get:

 

ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://vm-cluster-node1:8020/user/vagrant/sqooptest already exists

 

I could, of course, remove the target dir before running the import again, but that would defeat the whole purpose of fetching only the newer data and merging it with the old data (for which the old data needs to be present, I assume)!

 

Is this a (known) bug? Or am I doing something wrong? Any help would be appreciated 🙂

1 ACCEPTED SOLUTION

Expert Contributor
Also, it looks like https://issues.apache.org/jira/browse/SQOOP-1138 exists to address this concern.


8 REPLIES

Expert Contributor
Sqoop2 does not support incremental imports just yet (https://issues.apache.org/jira/browse/SQOOP-1168).

It looks like the command you're running is creating a saved job. Have you tried just executing the saved job (http://sqoop.apache.org/docs/1.4.4/SqoopUserGuide.html#_saved_jobs)? Seems like this is achievable via: sqoop job --exec import-test.

You don't need to create a job to perform incremental imports (http://sqoop.apache.org/docs/1.4.4/SqoopUserGuide.html#_incremental_imports).
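
For reference, the full saved-job workflow looks like this (job name taken from your create command):

sqoop job --list                 # list all saved jobs
sqoop job --show import-test     # show the stored parameters, including the saved last value
sqoop job --exec import-test     # run the job; on success Sqoop records the new last value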

-Abe

Explorer

@abe wrote:
Sqoop2 does not support incremental imports just yet (https://issues.apache.org/jira/browse/SQOOP-1168).

It looks like the command you're running is creating a saved job. Have you tried just executing the saved job (http://sqoop.apache.org/docs/1.4.4/SqoopUserGuide.html#_saved_jobs)? Seems like this is achievable via: sqoop job --exec import-test.

You don't need to create a job to perform incremental imports (http://sqoop.apache.org/docs/1.4.4/SqoopUserGuide.html#_incremental_imports).

-Abe

sqoop job --exec import-test is in fact how I ran the saved job. As I said, it runs fine the first time; the second time it complains that the output dir already exists.

The reason I used a saved job is the promise that it will keep track of the last value and fill it in automatically.

Expert Contributor
Hey there,

It seems you shouldn't have to do this, but adding "--append" to your command might help.
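
For example, something like this (an untested sketch; it just re-creates your saved job with --append added, and every other option is copied from your original command):

sqoop job --delete import-test
sqoop job --create import-test -- import \
  --connect jdbc:mysql://10.211.55.1/test_sqoop \
  --username root -P \
  --table test_incr_update \
  --target-dir /user/vagrant/sqooptest \
  --check-column updated \
  --incremental lastmodified \
  --append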

-Abe

Expert Contributor
By the way, what version of CDH are you using?

Expert Contributor
Also, it looks like https://issues.apache.org/jira/browse/SQOOP-1138 exists to address this concern.
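
Until that's fixed, the documented two-step workaround is to import the new rows into a separate directory and then combine them with the merge tool. Roughly like this (the directories follow your original command; the jar and class names are placeholders for whatever Sqoop's codegen produced for your table, and the merge key is assumed to be your primary key column):

# step 1: pull only the new/updated rows into a fresh directory
sqoop import --connect jdbc:mysql://10.211.55.1/test_sqoop --username root -P \
  --table test_incr_update --target-dir /user/vagrant/sqooptest_new \
  --check-column updated --incremental lastmodified --last-value "<timestamp of previous run>"

# step 2: merge the new rows onto the old import, keeping the newest row per key
sqoop merge --new-data /user/vagrant/sqooptest_new --onto /user/vagrant/sqooptest \
  --target-dir /user/vagrant/sqooptest_merged \
  --jar-file test_incr_update.jar --class-name test_incr_update \
  --merge-key id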

Explorer

I'm using CDH 5.0.2 together with Cloudera Manager 5.0.2. I think the Sqoop issue you linked is exactly the problem I'm having. I shouldn't have to add --append, because I'm already using lastmodified, which is the other incremental mode.

As long as SQOOP-1138 isn't fixed, Sqoop will be rather useless to me 🙂 The only alternative seems to be to export the whole database each time and replace the old data with the new export.

New Contributor

You can use --delete-target-dir when running the sqoop command:

 

sqoop import --connect jdbc:mysql://1.2.3.4:1234/retailfgh --username 12345 --password 12345 --table departments --delete-target-dir  --target-dir /poc/sqoop_destination --fields-terminated-by "~"

 

 

New Contributor

Any solution to this issue?

Using --append in place of --incremental lastmodified is not the correct solution, as it won't update the record but will create a new record in Hive.

 

--delete-target-dir defeats the purpose of updating data, as it recreates the directory every time, which is the same as importing the entire source table into HDFS/Hive on every run.

 

I tried using --merge-key, but it gives the following error:

 

19/03/20 07:07:41 ERROR tool.ImportTool: Import failed: java.io.IOException: Could not load jar /tmp/sqoop-gfctwnsg/compile/c63dd58c7ae7aa383d4fe8e795fd8604/FRESH.EMPLOYEERUSHI.jar into JVM. (Could not find class FRESH.EMPLOYEERUSHI.)
at org.apache.sqoop.util.ClassLoaderStack.addJarFile(ClassLoaderStack.java:92)
at com.cloudera.sqoop.util.ClassLoaderStack.addJarFile(ClassLoaderStack.java:36)
at org.apache.sqoop.tool.ImportTool.loadJars(ImportTool.java:120)
at org.apache.sqoop.tool.ImportTool.lastModifiedMerge(ImportTool.java:456)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:522)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Caused by: java.lang.ClassNotFoundException: FRESH.EMPLOYEERUSHI
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.net.FactoryURLClassLoader.loadClass(URLClassLoader.java:814)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.sqoop.util.ClassLoaderStack.addJarFile(ClassLoaderStack.java:88)

 

 

My sqoop command is as follows:

 

sqoop import \
--connect "jdbc:oracle:thin:@oraasmwd17-scan.nam.nsroot.net:8889/GENIFRD" \
--username FRESH \
--password C1T12016 \
--table FRESH.EMPLOYEERUSHI \
--merge-key id \
--target-dir /data/gfctwnsg/staging/hive/gfctwnsg_staging/rp86813/sqoopimportdir \
--incremental lastmodified \
--check-column MODIFIED_DATE \
--last-value '2019-03-20 06:43:59.0'

 

My source Oracle table is as follows:

1  Rushi Pradhan  engineer  30000  18-MAR-19
2  abc xyz        doctor    20000  18-MAR-19

 

I changed the salary of id = 1 and updated the corresponding date manually.

Now I want to reflect this change on the Hive end as well.

But there it does not let me update the record; it only appends.
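
One thing I haven't tried yet (just a guess on my part, since the stack trace fails to load the generated class FRESH.EMPLOYEERUSHI, and the dot makes that an invalid Java class name): forcing a valid name for the generated class with --class-name, e.g.

sqoop import \
--connect "jdbc:oracle:thin:@oraasmwd17-scan.nam.nsroot.net:8889/GENIFRD" \
--username FRESH \
--password C1T12016 \
--table FRESH.EMPLOYEERUSHI \
--class-name EmployeeRushi \
--merge-key id \
--target-dir /data/gfctwnsg/staging/hive/gfctwnsg_staging/rp86813/sqoopimportdir \
--incremental lastmodified \
--check-column MODIFIED_DATE \
--last-value '2019-03-20 06:43:59.0'

(EmployeeRushi here is just an arbitrary valid class name.)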