
Moving data to hive external table using sqoop


Please provide your inputs on moving data from DB2 to Hive using Sqoop.

We have more than 1 billion rows in DB2 tables and are planning to move that data to HDFS and use a Hive table to run analytics.
Records get inserted and updated daily in these DB2 tables, so we are planning to move the data in two steps using Sqoop: first load the existing data, then run a daily Sqoop import to move the previous day's data.
The DB2 table has a date column. When I move the data to the HDFS location, I want to store it as date partitions in an external table. It looks like there are limitations around partitions on external Hive tables:
https://issues.apache.org/jira/browse/HIVE-6589

If I create the external Hive table first, partitioned by date, like the example below:

create external table sample (id string, name string) partitioned by (date string)
location '/sampledata/';

a) How do I Sqoop-import the existing data to HDFS into different date folders under /sampledata? I have a year of data in DB2.
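For the initial load, a per-day loop like the rough sketch below is what I had in mind (the JDBC connection string, credentials, and the record_date column name are placeholders, not the real names). Is there a better approach, for example one bulk import followed by a Hive dynamic-partition insert?

# Rough sketch of the one-time initial load: one Sqoop job per day of
# history, each writing that day's rows into its own folder under /sampledata.
# The JDBC connection string, credentials and the record_date column name
# are placeholders -- adjust them to the real DB2 table.
start_date="2015-01-01"   # first day of history to load (placeholder)
end_date="2016-01-01"     # day after the last day to load (placeholder)

d="$start_date"
while [ "$d" != "$end_date" ]; do
  sqoop import \
    --connect "<jdbc-connection-string>" \
    --username abc --password abc \
    --table source_table \
    --where "record_date = '$d'" \
    --target-dir "/sampledata/$d" \
    -m 1
  d=$(date -d "$d + 1 day" +%F)   # GNU date
done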

b) How do I Sqoop the incremental data daily to /sampledata under a new folder for that date?
I think we can achieve this with the command below, passing the currentdatefolder value as input:

sqoop import "" --username abc -password abc --table source_table --target-dir /sampledata/currentdatefolder -m1 --check-column modified_date --incremental lastmodified --last-value {last_import_date}

But after that, do we need to alter the table to add this new folder as a partition?

ALTER TABLE sample ADD PARTITION (date='currentdatefolder') LOCATION '/sampledata/currentdatefolder';
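For context, this is roughly the end-to-end daily job I am picturing (again with placeholder connection details and column names). My understanding from HIVE-6589 is that the explicit ADD PARTITION step is still required for an external table, but please correct me if there is a simpler way, for example MSCK REPAIR TABLE when the folders are named date=YYYY-MM-DD.

# Rough sketch of the daily job: import yesterday's changes into a new date
# folder, then register that folder as a partition of the external table.
# Connection string, credentials and column names are placeholders.
currentdatefolder=$(date -d "yesterday" +%F)

sqoop import \
  --connect "<jdbc-connection-string>" \
  --username abc --password abc \
  --table source_table \
  --target-dir "/sampledata/$currentdatefolder" \
  -m 1 \
  --check-column modified_date \
  --incremental lastmodified \
  --last-value "$last_import_date"

# Register the new folder as a partition (not added automatically for
# external tables, per HIVE-6589):
hive -e "ALTER TABLE sample ADD IF NOT EXISTS PARTITION (date='$currentdatefolder') LOCATION '/sampledata/$currentdatefolder';"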
