Postgres to Sqoop to Hive ORC Table Using HCatalog with Dynamic Partitioning on a Date Column

Hello all,

I have the following script, which works:

sudo -u hdfs sqoop import \
  --connect jdbc:postgresql://xxxx \
  --username foo --password bar \
  --driver org.postgresql.Driver \
  --table upstream_lead_vendors \
  --hcatalog-database default \
  --hcatalog-table test_orc_sqoop_2 \
  --create-hcatalog-table \
  --hcatalog-storage-stanza "stored as orcfile" \
  -m 1

I would like to add dynamic partitioning to this script, specifically using the values of a date column in my Postgres table, so that the import is partitioned by the distinct dates in that column.

How should I proceed? I haven't found much about this online, so any help or examples would be much appreciated.
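For context, the rough direction I have been experimenting with is sketched below, though I'm not sure it is correct. The Hive column names (id, vendor_name) and the Postgres date column (created_at) are placeholders for my actual schema; the idea is to pre-create a partitioned ORC table and then let Sqoop's HCatalog integration fill the partitions dynamically from a derived date string:

# 1) Pre-create the partitioned ORC table in Hive
#    (id, vendor_name, created_date are placeholders for my actual columns):
hive -e "
CREATE TABLE default.test_orc_sqoop_2_part (
  id BIGINT,
  vendor_name STRING
)
PARTITIONED BY (created_date STRING)
STORED AS ORC;"

# 2) Import with a free-form query that derives the partition value from the
#    Postgres date column (created_at is a placeholder). As I understand it,
#    partition keys that are not given static values are filled dynamically
#    from the matching column in the query results:
sudo -u hdfs sqoop import \
  --connect jdbc:postgresql://xxxx \
  --username foo --password bar \
  --driver org.postgresql.Driver \
  --query "SELECT id, vendor_name, to_char(created_at, 'YYYY-MM-DD') AS created_date FROM upstream_lead_vendors WHERE \$CONDITIONS" \
  --hcatalog-database default \
  --hcatalog-table test_orc_sqoop_2_part \
  -m 1

From what I can tell in the Sqoop user guide, dynamic partition columns must not contain NULLs or the job aborts, so the date column would need to be NOT NULL (or be coalesced in the query). Is this the right approach, or is there a simpler way?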

Thanks,

Marc

1 Reply

Re: Postgres to Sqoop to Hive ORC Table Using HCatalog with Dynamic Partitioning on a Date Column

@Ravi Mutyala, @Jay SenSharma, any suggestions on the above? I'm having a hard time finding good examples/resources on this.
