
Modify datatype during sqoop import


I want to import data from Oracle into HDFS. Column4 is a DATE field in Oracle, but I want to store it as a Timestamp in HDFS. I tried both Timestamp and java.sql.Timestamp, but the import failed. Below is the error. Please help.

 

sqoop import \
  --connect *** --username *** --password ****** \
  --query 'select column1,column2,column3,column4 from abc where $CONDITIONS' \
  --split-by column1 \
  --delete-target-dir \
  --target-dir /data/encrypt/abc \
  --compression-codec org.apache.hadoop.io.compress.SnappyCodec \
  --as-parquetfile \
  --map-column-java Column4=Timestamp

 

--map-column-java Column4=Timestamp

--map-column-java Column4=java.sql.Timestamp

 

ERROR tool.ImportTool: Import failed: No ResultSet method for Java type Timestamp

 

ERROR tool.ImportTool: Import failed: Cannot convert to AVRO type java.sql.Timestamp


Re: Modify datatype during sqoop import

It appears that you're trying to use Sqoop's internal handling of the DATE/TIMESTAMP data types instead of the Strings that the Oracle connector converts them to.

Have you tried the option specified at https://sqoop.apache.org/docs/1.4.6/SqoopUserGuide.html#_java_sql_timestamp?

-Doraoop.timestamp.string=false

With that option you shouldn't need to map the column types manually.
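Putting that together, a sketch of the retried import might look like the following. Note that `-D` is a Hadoop generic option and must come immediately after the tool name (`import`), before any tool-specific arguments; the connection details are placeholders exactly as in the original post, and the `--map-column-java` override is dropped since the connector option handles the conversion.

```shell
# Sketch: the original import retried with the oraoop.timestamp.string
# connector option instead of a manual --map-column-java override.
# Generic -D options must precede the tool-specific arguments.
sqoop import \
  -Doraoop.timestamp.string=false \
  --connect *** --username *** --password ****** \
  --query 'select column1,column2,column3,column4 from abc where $CONDITIONS' \
  --split-by column1 \
  --delete-target-dir \
  --target-dir /data/encrypt/abc \
  --compression-codec org.apache.hadoop.io.compress.SnappyCodec \
  --as-parquetfile
```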