Modify data type during Sqoop import
Labels: Apache Hive, Apache Sqoop
Created on 02-22-2019 02:11 PM - edited 09-16-2022 07:11 AM
I want to import data from Oracle into HDFS. Column4 is a DATE column in Oracle, but I want to store it as a timestamp in HDFS. I tried mapping it to both Timestamp and java.sql.Timestamp, but the import failed with the errors below. Please help.
sqoop import --connect *** --username *** --password ****** \
  --query 'select column1,column2,column3,column4 from abc where $CONDITIONS' \
  --split-by column1 --delete-target-dir --target-dir /data/encrypt/abc \
  --compression-codec org.apache.hadoop.io.compress.SnappyCodec \
  --as-parquetfile --map-column-java Column4=Timestamp

With --map-column-java Column4=Timestamp:
ERROR tool.ImportTool: Import failed: No ResultSet method for Java type Timestamp

With --map-column-java Column4=java.sql.Timestamp:
ERROR tool.ImportTool: Import failed: Cannot convert to AVRO type java.sql.Timestamp
Created 03-07-2019 09:08 PM
Have you tried the option specified at https://sqoop.apache.org/docs/1.4.6/SqoopUserGuide.html#_java_sql_timestamp?
-Doraoop.timestamp.string=false
You shouldn't need to map the column types manually in this approach.
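For reference, a minimal sketch of how that option might be combined with the command from the question (untested; the connection string, credentials, query, and paths are the placeholders from the original post). Generic -D options must appear immediately after the tool name, before any tool-specific arguments:

# let Sqoop handle DATE/TIMESTAMP itself instead of importing them as strings
sqoop import \
  -Doraoop.timestamp.string=false \
  --connect *** --username *** --password ****** \
  --query 'select column1,column2,column3,column4 from abc where $CONDITIONS' \
  --split-by column1 --delete-target-dir --target-dir /data/encrypt/abc \
  --compression-codec org.apache.hadoop.io.compress.SnappyCodec \
  --as-parquetfile

Note that the oraoop.* properties are read by the Data Connector for Oracle and Hadoop, which (in Sqoop 1.4.5+) is enabled with the --direct argument; if that connector is not in use, the property should have no effect.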
