Created 06-27-2018 02:21 PM
I am getting this error at the end, but it is still loading the data into HBase fine, so what does this 'Cannot append files to target dir' error mean for this load?
sqoop command
sqoop job -Dmapreduce.job.user.classpath.first=true --create incjob -- import \
  --connect "jdbc:oracle:thin:@(description=(address=(protocol=tcp)(host=patronQA)(port=1526))(connect_data=(service_name=patron)))" \
  --username PATRON \
  --incremental append --check-column INSERT_TIME \
  --table PATRON.UFM_VIEW --split-by UFM_VIEW.UFMID \
  --target-dir /user/root/_sqoop \
  --hbase-table UFM --column-family F1 --hbase-row-key "UFMID" \
  --columns "UFMID,LANEUFMSEQNO,LANEID,PLAZAID,TXNTM,TIP_ID,TIPUFMSEQ,INSERT_TIME"
File Input Format Counters
  Bytes Read=0
File Output Format Counters
  Bytes Written=0
18/06/27 10:05:48 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 147.9979 seconds (0 bytes/sec)
18/06/27 10:05:48 INFO mapreduce.ImportJobBase: Retrieved 999 records.
18/06/27 10:05:48 WARN util.AppendUtils: Cannot append files to target dir; no such directory: _sqoop/c81a737093c64d4492c58671affe31fe_PATRON.UFM_VIEW
18/06/27 10:05:48 INFO tool.ImportTool: Saving incremental import state to the metastore
18/06/27 10:05:49 INFO tool.ImportTool: Updated data for job: incjob
[hdfs@hadoop1 ~]$
hbase gets the data fine
hbase(main):001:0>
hbase(main):002:0* count 'UFM', INTERVAL => 20000
999 row(s) in 0.3320 seconds
=> 999
hbase(main):003:0>
Created 06-27-2018 03:55 PM
This is just a warning message, not an error. It can occur when --target-dir and incremental append mode are specified together with an HBase import.
https://github.com/apache/sqoop/blob/trunk/src/java/org/apache/sqoop/util/AppendUtils.java#L77-L80
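If you want to get rid of the warning rather than just ignore it, one option might be to recreate the saved job without --target-dir, since the HBase import never produces the HDFS files that the append step would move. This is an untested sketch based on your own command above; it assumes your Sqoop version accepts an incremental HBase import without an explicit --target-dir.

# Drop the old saved job definition, then recreate it without --target-dir.
sqoop job --delete incjob

sqoop job -Dmapreduce.job.user.classpath.first=true --create incjob -- import \
  --connect "jdbc:oracle:thin:@(description=(address=(protocol=tcp)(host=patronQA)(port=1526))(connect_data=(service_name=patron)))" \
  --username PATRON \
  --incremental append --check-column INSERT_TIME \
  --table PATRON.UFM_VIEW --split-by UFM_VIEW.UFMID \
  --hbase-table UFM --column-family F1 --hbase-row-key "UFMID" \
  --columns "UFMID,LANEUFMSEQNO,LANEID,PLAZAID,TXNTM,TIP_ID,TIPUFMSEQ,INSERT_TIME"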
-Aditya
Created 06-27-2018 04:03 PM
But doesn't this import create files somewhere in HDFS, or is the data moved directly into HBase?
Created 06-27-2018 04:16 PM
Since you have specified --hbase-table, it will import into HBase rather than HDFS.
Ref: https://sqoop.apache.org/docs/1.4.2/SqoopUserGuide.html#_importing_data_into_hbase
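If you want to convince yourself where the data ended up, something like the following should show it. The table, directory, and job names are taken from this thread; treat it as a rough sketch.

# Rows go straight into the HBase table, so the count should match "Retrieved 999 records".
echo "count 'UFM', INTERVAL => 20000" | hbase shell

# The HDFS --target-dir gets no per-import data files from an HBase import
# (it may not even exist), which is exactly why AppendUtils printed the warning.
hdfs dfs -ls /user/root/_sqoop

# The incremental state (last INSERT_TIME value seen) is kept with the saved job in the metastore.
sqoop job --show incjob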