Explorer
Posts: 12
Registered: ‎02-21-2019

Modify datatype during sqoop import

I want to import data from Oracle into HDFS. Column4 is a DATE field in Oracle, but I want to store it as a Timestamp in HDFS. I tried both Timestamp and java.sql.Timestamp, but the import failed. Below are the command and the errors. Please help.

 

sqoop import --connect *** --username *** --password ****** --query 'select column1,column2,column3,column4 from abc where $CONDITIONS' --split-by column1 --delete-target-dir --target-dir /data/encrypt/abc --compression-codec org.apache.hadoop.io.compress.SnappyCodec --as-parquetfile --map-column-java Column4=Timestamp

 

--map-column-java Column4=Timestamp

--map-column-java Column4=java.sql.Timestamp

 

ERROR tool.ImportTool: Import failed: No ResultSet method for Java type Timestamp

 

ERROR tool.ImportTool: Import failed: Cannot convert to AVRO type java.sql.Timestamp

Posts: 1,892
Kudos: 431
Solutions: 302
Registered: ‎07-31-2013

Re: Modify datatype during sqoop import

It appears that you're trying to use Sqoop's internal handling of DATE/TIMESTAMP data types, rather than the String representation that the Oracle connector converts them to by default.

Have you tried the option specified at https://sqoop.apache.org/docs/1.4.6/SqoopUserGuide.html#_java_sql_timestamp?

-Doraoop.timestamp.string=false

You shouldn't need to map the column types manually in this approach.
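As a sketch, the flag would slot into the original command like this (connection placeholders kept as-is from the question; note that -D generic options must come immediately after the tool name, before tool-specific flags, and the oraoop.* properties apply when the Oracle direct connector is in use):

```shell
# Hedged sketch: the original import command with the suggested generic
# option added and the --map-column-java override removed.
# -D options are generic Hadoop arguments and must appear right after
# "import", before tool-specific flags like --connect.
sqoop import \
  -Doraoop.timestamp.string=false \
  --connect *** --username *** --password '******' \
  --query 'select column1,column2,column3,column4 from abc where $CONDITIONS' \
  --split-by column1 \
  --delete-target-dir \
  --target-dir /data/encrypt/abc \
  --compression-codec org.apache.hadoop.io.compress.SnappyCodec \
  --as-parquetfile
```

With oraoop.timestamp.string set to false, the connector should hand the DATE column to Sqoop as a timestamp type rather than a String, so no --map-column-java mapping is needed.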