Created on 03-30-2017 06:19 AM - edited 09-16-2022 04:22 AM
First of all, sorry for my bad English.
Guys, I've been having a problem trying to import data from my Oracle database into Hive using Sqoop.
When importing the data into a previously created table with DECIMAL and TIMESTAMP datatypes, I get a conversion error.
If I instead let Sqoop create the table, it creates the columns as STRING and BIGINT.
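For illustration, this is the kind of pre-created table I mean (the table and column names here are made up, not my real schema); a DECIMAL/TIMESTAMP schema stored as Parquet:

```shell
# Hypothetical example of the pre-created Hive table (names are placeholders).
# The goal is to keep DECIMAL and TIMESTAMP instead of STRING/BIGINT.
hive -e "
CREATE TABLE sales (
  id          DECIMAL(10,0),
  amount      DECIMAL(12,2),
  created_at  TIMESTAMP,
  description VARCHAR(200)
)
STORED AS PARQUET;
"
```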
I tried to use the --map-column-hive parameter, but with Parquet it turns out to be ignored. And that's where things get interesting!
With the SequenceFile format, --map-column-hive is respected.
In short, I need to import the data from Oracle into Hive, with the table stored as Parquet, while keeping the DECIMAL and TIMESTAMP datatypes. Ideally, I would like to pre-create the table myself with VARCHAR, CHAR, DECIMAL, and TIMESTAMP columns.
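For reference, here is a sketch of the kind of import command I am running (the JDBC URL, credentials, and table names below are placeholders, not my real values):

```shell
# Hypothetical Sqoop invocation (connection string, user, and tables are placeholders).
# --as-parquetfile writes Parquet; --map-column-hive is where I try to force
# the Hive column types, but it only seems to take effect with SequenceFile.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username scott \
  --password-file /user/me/oracle.pwd \
  --table SALES \
  --hive-import \
  --hive-table sales \
  --as-parquetfile \
  --map-column-hive AMOUNT=DECIMAL,CREATED_AT=TIMESTAMP
```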
How can I proceed? Should I upgrade any of my components?
I am currently using the QuickStart VM 5.8; I believe Hive is at version 1.1.
Here are some links to similar problems:
Thank you,
Alexandre.