04-06-2017 09:30 AM
Hi, thanks for replying. I did not understand how to apply this solution; maybe my question was not clear. Is it possible to import data via Sqoop into a Hive table that is stored as Parquet and was previously created with decimal and timestamp datatypes?
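For illustration, a pre-created table of the kind being described might look like this (the table and column names are hypothetical, not taken from the thread):

# Hypothetical Hive table stored as Parquet, pre-created with
# decimal and timestamp columns before running the Sqoop import.
hive -e "
CREATE TABLE orders_parquet (
  order_id   BIGINT,
  amount     DECIMAL(10,2),
  created_at TIMESTAMP
)
STORED AS PARQUET;"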
03-30-2017 06:19 AM
First of all, sorry for my bad English. Guys, I've been having a problem importing data from my Oracle database into Hive using Sqoop. When I import into a previously created table with decimal and timestamp datatypes, I get a conversion error. If I don't pre-create the table and let Sqoop create it instead, the table ends up with string and bigint datatypes. I tried the map-column-hive parameter, but with Parquet it turns out to be useless. And that's where things get interesting: with the SequenceFile format, map-column-hive is respected.

In short, I need to import the data from Oracle into Hive, with the table stored as Parquet, while keeping the decimal and timestamp datatypes. Ideally, I would pre-create my table with varchar, char, decimal, and timestamp datatypes. How should I proceed? Should I upgrade any of my components? I'm currently using the QuickStart VM 5.8, so I believe Hive is at version 1.1.

Here are some links to similar problems:
http://community.cloudera.com/t5/Data-Ingestion-Integration/SQOOP-IMPORT-map-column-hive-ignored/td-p/45369
http://community.cloudera.com/t5/Data-Ingestion-Integration/Using-Parquet-in-sqoop-import-automatically-converts-the/m-p/47339#M1996

Thank you, Alexandre.
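One commonly suggested workaround for this situation (not confirmed anywhere in this thread) is to import into a plain-text staging table first, and then convert into the pre-created Parquet table with an INSERT ... SELECT and explicit casts in Hive. A rough sketch, where the connection string, credentials, and table names are all hypothetical:

# 1) Import from Oracle into a plain-text Hive staging table.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username scott --password tiger \
  --table ORDERS \
  --hive-import \
  --hive-table orders_staging \
  --hive-overwrite \
  --as-textfile \
  --num-mappers 1

# 2) Convert into the pre-created Parquet table, casting to the
#    desired decimal and timestamp types in Hive.
hive -e "
INSERT OVERWRITE TABLE orders_parquet
SELECT order_id,
       CAST(amount AS DECIMAL(10,2)),
       CAST(created_at AS TIMESTAMP)
FROM orders_staging;"

This sidesteps the Parquet limitation because map-column-hive and ordinary Hive casts both behave normally for text-format tables, and the final conversion to Parquet happens entirely inside Hive.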
Labels:
- Apache Hive
- Apache Sqoop