Support Questions
Find answers, ask questions, and share your expertise

Modify datatype during sqoop import

I want to import data from Oracle into HDFS. Column4 is a DATE field in Oracle, but I want to store it as a Timestamp in HDFS. I tried both Timestamp and java.sql.Timestamp, and the import failed both times. The errors are below. Please help.


sqoop import --connect *** --username *** --password ****** --query 'select column1,column2,column3,column4 from abc where $CONDITIONS' --split-by column1 --delete-target-dir --target-dir /data/encrypt/abc --compression-codec --as-parquetfile --map-column-java Column4=Timestamp


--map-column-java Column4=Timestamp
ERROR tool.ImportTool: Import failed: No ResultSet method for Java type Timestamp

--map-column-java Column4=java.sql.Timestamp
ERROR tool.ImportTool: Import failed: Cannot convert to AVRO type java.sql.Timestamp
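[Editor's note, not from the original thread: `--map-column-java` only accepts Java types Sqoop's code generator can read from a JDBC ResultSet, and with `--as-parquetfile` the data passes through an Avro schema that has no conversion for java.sql.Timestamp, which matches the two errors above. Note also that `--compression-codec` in the command as posted is missing its codec argument. One commonly used workaround, sketched here under the assumption that a string representation of the date is acceptable downstream, is to cast the DATE inside the free-form query itself; the connection details below are placeholders.]

```shell
# Sketch only: cast the Oracle DATE to a string in the query so that no
# --map-column-java override is needed. Connection values are placeholders.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username myuser -P \
  --query "select column1, column2, column3,
                  to_char(column4, 'YYYY-MM-DD HH24:MI:SS') as column4
           from abc where \$CONDITIONS" \
  --split-by column1 \
  --delete-target-dir \
  --target-dir /data/encrypt/abc \
  --as-parquetfile
```

With the query double-quoted, `$CONDITIONS` must be escaped as `\$CONDITIONS` so the shell does not expand it before Sqoop sees it.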


Re: Modify datatype during sqoop import

Master Guru
It appears that you're trying to override Sqoop's internal handling of the DATE/TIMESTAMP data types, rather than relying on the Oracle connector, which converts those columns to Strings by default.

Have you tried the option specified at


You shouldn't need to map the column types manually in this approach.
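[Editor's note: the link in the reply above is missing. If the connector in play is the Data Connector for Oracle and Hadoop (OraOop), the behavior described is governed by the oraoop.timestamp.string property, which defaults to true and causes DATE/TIMESTAMP columns to be imported as Strings. It is an assumption that this is the option the reply referred to; a minimal sketch with placeholder connection details:]

```shell
# Assumption: the reply refers to OraOop's oraoop.timestamp.string property.
# Setting it to false keeps Sqoop's native DATE/TIMESTAMP handling instead of
# converting the values to Strings. Connection values are placeholders.
# -D options must come immediately after the tool name ("import").
sqoop import -Doraoop.timestamp.string=false \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username myuser -P \
  --table ABC \
  --target-dir /data/encrypt/abc
```

With this property set, no `--map-column-java` override should be needed, which matches the reply's conclusion.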