Errors with SQOOP after moving from CDH 5.4.8 to CDH 5.5.1

We recently upgraded from CDH 5.4.8 to CDH 5.5.1, and our Sqoop jobs that import data from Oracle into HDFS no longer work.

When I run this Sqoop job:

sqoop import --connect {CONNECTION} \
--username {USERNAME} \
--password {PASSWORD} \
--verbose \
--table {ORACLE DB TABLE} \
--where "orig_dt>=to_date('2015-12-01 00:00', 'YYYY-MM-DD HH24:mi') and orig_dt<to_date('2015-12-01 00:01', 'YYYY-MM-DD HH24:mi')" \
-z \
--compression-codec org.apache.hadoop.io.compress.SnappyCodec \
--as-parquetfile \
--target-dir {HDFS DIRECTORY} \
--split-by ORIG_DT \
-m 20

it fails with these errors:

FATAL [IPC Server handler 12 on 42706] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1450130973913_0017_m_000011_0 - exited : java.io.IOException: SQLException in nextKeyValue

Caused by: java.sql.SQLDataException: ORA-01861: literal does not match format string

But when I run a free-form query version of the same Sqoop job:

sqoop import --connect {CONNECTION} \
--username {USERNAME} \
--password {PASSWORD} \
--verbose \
--query "SELECT * FROM {ORACLE DB TABLE} WHERE ORIG_DT>=TO_DATE('2015-12-01 00:00', 'YYYY-MM-DD HH24:mi') AND ORIG_DT<TO_DATE('2015-12-01 00:01', 'YYYY-MM-DD HH24:mi') AND \$CONDITIONS" \
-z \
--compression-codec org.apache.hadoop.io.compress.SnappyCodec \
--as-parquetfile \
--target-dir {HDFS DIRECTORY} \
--split-by ORIG_DT \
-m 20

it fails with a different error:

FATAL [IPC Server handler 9 on 52286] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1450130973913_0018_m_000011_0 - exited : org.kitesdk.data.IncompatibleSchemaException: The type cannot be used to read from or write to the dataset:  

If anyone has some insight into what's causing this, I would appreciate the help.
