Created 03-01-2016 09:51 PM
I am using Sqoop to import from a Netezza database into Hive. The hive-import succeeds, but when I check the data types, some of the Netezza types have changed:
Netezza ----> Hive
BIGINT ----> BIGINT
DATE ----> STRING
INT ----> INT
CHARACTER ----> STRING
NUMERIC ----> DOUBLE
TIMESTAMP ----> STRING
But I want (1) TIMESTAMP in Hive instead of STRING, and (2) NUMERIC -> DOUBLE... is this right?
I might transfer more tables, so what can I do to keep my data types in Hive instead of letting Sqoop convert them automatically?
What is the best solution for this?
Created 03-01-2016 10:55 PM
You can pass the --map-column-hive property to override the default type mapping: https://sqoop.apache.org/docs/1.4.6/SqoopUserGuide.html#_controlling_type_mapping
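A minimal sketch of the flag in use, assuming a hypothetical Netezza TIMESTAMP column named event_ts that should land in Hive as timestamp rather than string (substitute your real column name):

```shell
# Force event_ts to Hive's timestamp type instead of the default string mapping
sqoop import \
  --connect jdbc:netezza://HOSTNAME/dbNAME \
  --username username --password password \
  --table tablename \
  --hive-import \
  --map-column-hive event_ts=timestamp
```

Multiple overrides can be given as a comma-separated list, e.g. --map-column-hive col1=timestamp,col2=string.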
Created 03-02-2016 02:52 PM
Thanks Artem. How can I specify that in the Sqoop command, all in one?
sqoop import --connect jdbc:netezza://HOSTNAME/dbNAME --username username --password password --query 'select * from db.schema.tablename WHERE $CONDITIONS' --hive-import --target-dir /user/hive/test1 --hive-table dbname.tablename --split-by columnname
With this command I still get NUMERIC mapped to DOUBLE and TIMESTAMP to STRING; I want to change these data types in Hive as well. Can you suggest what my Sqoop command should look like?
Created 03-02-2016 03:48 PM
@mike pal please read the instructions I provided; there's an example for --map-column-java, but the same pattern applies to --map-column-hive:
sqoop import ... --map-column-java id=String,value=Integer
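Putting it together with your earlier command, a sketch of the full import might look like the following. The column names event_ts and amount are placeholders for illustration; replace them with the actual TIMESTAMP and NUMERIC columns from your table:

```shell
# Hypothetical full command: same import as before, plus Hive type overrides.
# event_ts=timestamp keeps the Netezza TIMESTAMP instead of string;
# amount=double makes the NUMERIC mapping explicit.
sqoop import \
  --connect jdbc:netezza://HOSTNAME/dbNAME \
  --username username --password password \
  --query 'select * from db.schema.tablename WHERE $CONDITIONS' \
  --hive-import \
  --target-dir /user/hive/test1 \
  --hive-table dbname.tablename \
  --split-by columnname \
  --map-column-hive event_ts=timestamp,amount=double
```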
Created 03-02-2016 03:53 PM
Thanks Artem... really appreciated.
Created 03-02-2016 03:54 PM
@mike pal no problem, let me know if that works for you and accept the answer if you're satisfied.