Please provide your input on moving data from DB2 to Hive using Sqoop.
We have more than 1 billion rows in DB2 tables and plan to move them to HDFS and run analytics through a Hive table. Records are inserted and updated daily in these DB2 tables, so we plan to move the data in two steps using Sqoop: first a one-time load of the existing data, then a daily Sqoop import of the previous day's data.

The DB2 table has a date column, and when I move the data to HDFS I want to store it as date partitions in an external table. It looks like there are limitations on partitions in external Hive tables: https://issues.apache.org/jira/browse/HIVE-6589
If I create the external Hive table first, partitioned by date, like the example below (I used dt rather than date for the partition column, since date is a reserved keyword in Hive):
create external table sample (ID string, name string) partitioned by (dt string) location '/sampledata/';
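From what I understand of HIVE-6589, Hive will not pick up new partition folders of an external table automatically; each one has to be registered explicitly. A sketch of what I expect to run after each import (the date value is just an example):

alter table sample add if not exists partition (dt='2016-01-05') location '/sampledata/dt=2016-01-05';

If the folders follow the dt=YYYY-MM-DD naming convention under /sampledata, I believe MSCK REPAIR TABLE sample can also discover them in bulk.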
a) How do I sqoop import the existing data to HDFS into separate date folders under /sampledata? I have a year of data in DB2.
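One approach I am considering is a shell loop that runs one Sqoop import per day, filtering on the date column with --where. This is only a sketch; the connection string, credentials, and the RECORD_DATE column name are placeholders:

#!/bin/bash
# One-time historical load: one Sqoop import per day for a year of data.
# db2host, MYDB, dbuser, and RECORD_DATE are placeholders for illustration.
start="2015-01-01"
for i in $(seq 0 364); do
    d=$(date -d "$start + $i days" +%Y-%m-%d)
    sqoop import \
        --connect jdbc:db2://db2host:50000/MYDB \
        --username dbuser \
        --password-file /user/me/db2.password \
        --table SAMPLE \
        --where "RECORD_DATE = '$d'" \
        --target-dir /sampledata/dt=$d \
        --num-mappers 4
done

Would 365 separate jobs be too slow, or would it be better to do a single bulk import into a staging directory and then a Hive insert with dynamic partitioning to fan the rows out into date partitions?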
b) How do I sqoop the incremental data daily to a new folder for that date under /sampledata? I think we can achieve this with the command below, passing the current date folder value as an input.
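Something like this is what I have in mind (again a sketch: connection details and the RECORD_DATE column are placeholders, and currentdatefolder would be supplied by the scheduler):

d=$currentdatefolder    # e.g. 2016-01-05, passed in by the daily job
sqoop import \
    --connect jdbc:db2://db2host:50000/MYDB \
    --username dbuser \
    --password-file /user/me/db2.password \
    --table SAMPLE \
    --where "RECORD_DATE = '$d'" \
    --target-dir /sampledata/dt=$d \
    --num-mappers 4
# register the new folder as a partition of the external table
hive -e "alter table sample add if not exists partition (dt='$d') location '/sampledata/dt=$d'"

Since rows are also updated (not just inserted), I suspect a plain daily append will miss updates to older rows; would Sqoop's --incremental lastmodified mode with --merge-key be the right way to handle that?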