sqoop import \
  --incremental lastmodified --check-column [check-column] --last-value '2016-01-01 00:00:00.0' \
  --merge-key ck \
  --connect [jdbc-string] --username [username] --password [password] \
  --table [schema].[table] --target-dir [target-directory] \
  --fields-terminated-by '\001' --hive-drop-import-delims \
  --null-string '\\N' --null-non-string '\\N' \
  -m 1
16/08/24 07:24:24 INFO mapreduce.ImportJobBase: Transferred 537.9043 KB in 28.3187 seconds (18.9947 KB/sec).
16/08/24 07:24:24 INFO mapreduce.ImportJobBase: Retrieved 7009 records.
16/08/24 07:24:24 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Could not load jar /tmp/sqoop-qxo3856/compile/66a565bb3751337e891778b2779994c5/BDGWK_READ.V_BDGWK_T_GW_ANTPOS_RVLINK.jar into JVM. (Could not find class BDGWK_READ.V_BDGWK_T_GW_ANTPOS_RVLINK.)
Caused by: java.lang.ClassNotFoundException: BDGWK_READ.V_BDGWK_T_GW_ANTPOS_RVLINK
The MapReduce job that Sqoop launches in the background fetches the expected number of rows, but the import tool then fails to find the generated class for the table and cannot load the data into the target directory.
Initially we had a table without a primary key, so we created a view on the table and added a composite-key column: the concatenation of three columns, unique for every record. We are now importing from that view.
The import succeeds on the first run, or whenever we change the target directory. But if we execute the same command a second time against the same directory, it fails with the error above.
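One way to work around the re-run failure is to never point the incremental import at a pre-existing directory: import each delta into a fresh temporary directory, then fold it into the existing data with `sqoop merge`. The sketch below assumes this approach; the variable names, HDFS paths, and the jar/class produced by a prior `sqoop codegen` run are placeholders, not values from the original post.

```shell
# Fresh staging directory for every run, so --target-dir never exists yet.
TMP_DIR=/tmp/sqoop_stage_$(date +%s)

sqoop import \
  --connect "$JDBC_STRING" --username "$SQOOP_USER" --password "$SQOOP_PASS" \
  --table "$SCHEMA.$TABLE" \
  --incremental lastmodified --check-column "$CHECK_COL" \
  --last-value "$LAST_VALUE" \
  --target-dir "$TMP_DIR" -m 1

# Merge the new delta onto the existing data set on the composite key,
# then swap the merged output into place.
sqoop merge \
  --new-data "$TMP_DIR" --onto "$FINAL_DIR" \
  --target-dir "${FINAL_DIR}_merged" \
  --jar-file "$TABLE_JAR" --class-name "$TABLE_CLASS" \
  --merge-key ck

hdfs dfs -rm -r "$FINAL_DIR"
hdfs dfs -mv "${FINAL_DIR}_merged" "$FINAL_DIR"
```

This keeps the import and the merge as two explicit steps, so a failed merge never leaves the original directory half-overwritten.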
I am facing the same issue. When executing a second time with the same target directory, it fails with the same error.
After specifying a different --target-dir it executes successfully, but it still retrieves no rows, even though rows in the original table were updated.
sqoop import \
  --connect jdbc:oracle:thin:@//$host --username admin --password admin \
  --table INKREMENTAL_TEST \
  --check-column DATUM --incremental lastmodified --last-value '2016-10-30' \
  --merge-key id \
  --target-dir /tmp/inkrementalTabOrcfixerr \
  -m 1 -z
Creating and executing it as a job did not help.
As @bhagan said, use --append. To avoid any manual input (i.e. supplying the last-modified value yourself), I would suggest the Sqoop job feature together with --meta-connect: configure a Sqoop metastore, and you no longer have to hard-code the values for the incremental imports.
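A sketch of that saved-job approach, reusing the command from the question above: the metastore host/port, job name, and password-file path are placeholder assumptions, not values from this thread.

```shell
# Start a shared metastore once (default port 16000), or reuse an existing one:
#   sqoop metastore &

# Create a saved job in the metastore; everything after the bare `--`
# is the ordinary import command.
sqoop job \
  --meta-connect jdbc:hsqldb:hsql://metastore-host:16000/sqoop \
  --create inkremental_test_job \
  -- import \
  --connect jdbc:oracle:thin:@//$host \
  --username admin --password-file /user/admin/.sqoop.pw \
  --table INKREMENTAL_TEST \
  --check-column DATUM --incremental lastmodified --last-value '2016-10-30' \
  --merge-key id \
  --target-dir /tmp/inkrementalTabOrcfixerr -m 1

# Each execution reads the stored last-value and writes the new one back,
# so no value is ever hard-coded between runs.
sqoop job \
  --meta-connect jdbc:hsqldb:hsql://metastore-host:16000/sqoop \
  --exec inkremental_test_job
```

Note that saved jobs prompt for the password unless you use --password-file or enable password storage in the metastore.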