<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question SQOOP IMPORT FROM ORACLE TIMESTAMP ERROR  ORA-01840: input value not long enough for date format in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/SQOOP-IMPORT-FROM-ORACLE-TIMESTAMP-ERROR-ORA-01840-input/m-p/194590#M71277</link>
    <description>Forum thread: a Sqoop incremental import from Oracle fails with ORA-01840 ("input value not long enough for date format") because --last-value 0 cannot be parsed by Oracle's TO_TIMESTAMP with the 'YYYY-MM-DD HH24:MI:SS.FF' mask; the accepted fix is to pass a full timestamp such as '1900-01-01 00:00:00.0'. The full question and replies appear in the items below.</description>
    <pubDate>Sun, 18 Aug 2019 06:45:45 GMT</pubDate>
    <dc:creator>lokashis_r</dc:creator>
    <dc:date>2019-08-18T06:45:45Z</dc:date>
    <item>
      <title>SQOOP IMPORT FROM ORACLE TIMESTAMP ERROR  ORA-01840: input value not long enough for date format</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/SQOOP-IMPORT-FROM-ORACLE-TIMESTAMP-ERROR-ORA-01840-input/m-p/194590#M71277</link>
      <description>&lt;P&gt;&lt;STRONG&gt;ORA-01840: input value not long enough for date format&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;I have a table named DEMO_ORDERS in Oracle (image attached).&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;SQOOP INCREMENTAL IMPORT:&lt;/STRONG&gt;&lt;/P&gt;&lt;PRE&gt;sqoop import --connect jdbc:oracle:thin:system/system@192.168.XXX.XXX:1521:xe --username sonu --password sonu --table DEMO_ORDERS --target-dir '/user/root/orders/' --as-textfile --split-by CUSTOMER_ID --incremental lastmodified --check-column ORDER_TIMESTAMP --last-value 0 --num-mappers 5 --fields-terminated-by ',';&lt;/PRE&gt;&lt;P&gt;While importing data from Oracle to HDFS using Sqoop, I am getting the error below:&lt;/P&gt;&lt;PRE&gt;17/11/14 01:12:37 INFO manager.OracleManager: Time zone has been set to GMT
17/11/14 01:12:37 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM DEMO_ORDERS t WHERE 1=0
17/11/14 01:12:37 INFO tool.ImportTool: Incremental import based on column ORDER_TIMESTAMP
17/11/14 01:12:37 INFO tool.ImportTool: Lower bound value: TO_TIMESTAMP('0', 'YYYY-MM-DD HH24:MI:SS.FF')
17/11/14 01:12:37 INFO tool.ImportTool: Upper bound value: TO_TIMESTAMP('2017-11-14 01:58:34.0', 'YYYY-MM-DD HH24:MI:SS.FF')
17/11/14 01:12:37 INFO mapreduce.ImportJobBase: Beginning import of DEMO_ORDERS
17/11/14 01:12:37 INFO impl.TimelineClientImpl: Timeline service address: &lt;A href="http://sandbox.hortonworks.com:8188/ws/v1/timeline/" target="_blank" rel="nofollow noopener noreferrer"&gt;http://sandbox.hortonworks.com:8188/ws/v1/timeline/&lt;/A&gt;
17/11/14 01:12:38 INFO client.RMProxy: Connecting to ResourceManager at sandbox.hortonworks.com/192.168.144.128:8050
17/11/14 01:12:39 INFO db.DBInputFormat: Using read commited transaction isolation
17/11/14 01:12:39 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(CUSTOMER_ID), MAX(CUSTOMER_ID) FROM DEMO_ORDERS WHERE ( ORDER_TIMESTAMP &amp;gt;= TO_TIMESTAMP('0', 'YYYY-MM-DD HH24:MI:SS.FF') AND ORDER_TIMESTAMP &amp;lt; TO_TIMESTAMP('2017-11-14 01:58:34.0', 'YYYY-MM-DD HH24:MI:SS.FF') )
17/11/14 01:12:39 INFO mapreduce.JobSubmitter: Cleaning up the staging area /user/root/.staging/job_1510613991307_0011
17/11/14 01:12:39 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: java.sql.SQLDataException: ORA-01840: input value not long enough for date format&lt;/PRE&gt;&lt;P&gt;&lt;STRONG&gt;Please help me resolve this.&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/5019/ssubhas.html" nodeid="5019" target="_blank"&gt;@Sindhu&lt;/A&gt; &lt;A rel="user" href="https://community.cloudera.com/users/18929/yaswanthmuppireddy.html" nodeid="18929" target="_blank"&gt;@Shu&lt;/A&gt; &lt;A rel="user" href="https://community.cloudera.com/users/2031/shivpriya-tamboskar.html" nodeid="2031" target="_blank"&gt;@Shivpriya Tamboskar&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="43584-column-datatype.jpg" style="width: 497px;"&gt;&lt;img src="https://community.cloudera.com/t5/image/serverpage/image-id/18077i5CB88ED20FF6BDF0/image-size/medium?v=v2&amp;amp;px=400" role="button" title="43584-column-datatype.jpg" alt="43584-column-datatype.jpg" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;BR /&gt;&lt;IMG src="https://community.cloudera.com/t5/image/serverpage/image-id/6016i5DD2484DE5097C76/image-size/large?v=1.0&amp;amp;px=999" border="0" alt="data.jpg" title="data.jpg" /&gt;</description>
      <pubDate>Sun, 18 Aug 2019 06:45:45 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/SQOOP-IMPORT-FROM-ORACLE-TIMESTAMP-ERROR-ORA-01840-input/m-p/194590#M71277</guid>
      <dc:creator>lokashis_r</dc:creator>
      <dc:date>2019-08-18T06:45:45Z</dc:date>
    </item>
    <item>
      <title>Re: SQOOP IMPORT FROM ORACLE TIMESTAMP ERROR  ORA-01840: input value not long enough for date format</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/SQOOP-IMPORT-FROM-ORACLE-TIMESTAMP-ERROR-ORA-01840-input/m-p/194591#M71278</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/48179/lokashisr.html" nodeid="48179"&gt;@LOKASHIS RANA&lt;/A&gt;&lt;A rel="user" href="https://community.cloudera.com/users/48179/lokashisr.html" nodeid="48179"&gt;&lt;/A&gt;
&lt;/P&gt;&lt;P style="display: inline !important;"&gt;&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;A rel="user" href="https://community.cloudera.com/users/48179/lokashisr.html" nodeid="48179"&gt;&lt;/A&gt;
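The format mismatch behind ORA-01840 can be illustrated outside Oracle. The following is a Python analogy (not Oracle itself, and the strptime mask is only an approximation of Oracle's 'YYYY-MM-DD HH24:MI:SS.FF' model): a bare '0' is too short for the full timestamp mask, while a complete timestamp parses fine.

```python
from datetime import datetime

# Rough Python equivalent of Oracle's 'YYYY-MM-DD HH24:MI:SS.FF' mask.
FMT = "%Y-%m-%d %H:%M:%S.%f"

def parses(value: str) -> bool:
    """Return True if value satisfies the full timestamp format."""
    try:
        datetime.strptime(value, FMT)
        return True
    except ValueError:
        return False

print(parses("0"))                      # the bad --last-value 0  -&amp;gt; False
print(parses("1900-01-01 00:00:00.0"))  # a full timestamp        -&amp;gt; True
```

This mirrors why the import's lower bound, TO_TIMESTAMP('0', ...), fails while a full last value succeeds.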
&lt;/P&gt;&lt;P style="display: inline !important;"&gt;As the log shows, the failure is ORA-01840: input value not long enough for date format.&lt;/P&gt;&lt;P&gt;Use a full timestamp such as '1900-01-01 00:00:00.0' as the last value instead of 0, and run the import again with the sqoop statement below.&lt;/P&gt;&lt;PRE&gt;sqoop import --connect jdbc:oracle:thin:system/system@192.168.XXX.XXX:1521:xe --username sonu --password sonu --table DEMO_ORDERS --target-dir '/user/root/orders/' --as-textfile --split-by CUSTOMER_ID --incremental lastmodified --check-column ORDER_TIMESTAMP --last-value '1900-01-01 00:00:00.0' --num-mappers 5 --fields-terminated-by ','&lt;/PRE&gt;&lt;P&gt;In addition,&lt;BR /&gt;check what the data in the ORDER_TIMESTAMP column of the source table looks like. If it holds only a date, such as&lt;/P&gt;&lt;P&gt;2011-01-01, then the last value you pass must use the same format as the source data.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;&lt;U&gt;Example:&lt;/U&gt;&lt;/STRONG&gt;&lt;BR /&gt;if the source table's ORDER_TIMESTAMP data looks like 2011-01-01, then&lt;/P&gt;&lt;PRE&gt;--last-value 1900-01-01 // the last value must match the format of the data in the source table&lt;/PRE&gt;</description>
      <pubDate>Tue, 14 Nov 2017 05:19:46 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/SQOOP-IMPORT-FROM-ORACLE-TIMESTAMP-ERROR-ORA-01840-input/m-p/194591#M71278</guid>
      <dc:creator>Shu_ashu</dc:creator>
      <dc:date>2017-11-14T05:19:46Z</dc:date>
    </item>
    <item>
      <title>Re: SQOOP IMPORT FROM ORACLE TIMESTAMP ERROR  ORA-01840: input value not long enough for date format</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/SQOOP-IMPORT-FROM-ORACLE-TIMESTAMP-ERROR-ORA-01840-input/m-p/194592#M71279</link>
      <description>&lt;P&gt;Thanks a lot. It worked like a charm.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;The import completes successfully.&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;But one map task attempt fails with the error below (error attached):&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Error: java.lang.RuntimeException: java.lang.RuntimeException: java.sql.SQLRecoverableException: No more data to read from socket&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&lt;B&gt;Can you shed some light on the cause?&lt;/B&gt;&lt;/P&gt;&lt;PRE&gt;17/11/14 18:41:54 INFO mapreduce.Job: Running job: job_1510681534788_0007
17/11/14 18:42:02 INFO mapreduce.Job: Job job_1510681534788_0007 running in uber mode : false
17/11/14 18:42:02 INFO mapreduce.Job:  map 0% reduce 0%
17/11/14 18:42:30 INFO mapreduce.Job:  map 20% reduce 0%
17/11/14 18:43:01 INFO mapreduce.Job:  map 40% reduce 0%
17/11/14 18:43:02 INFO mapreduce.Job:  map 60% reduce 0%
17/11/14 18:43:14 INFO mapreduce.Job:  map 80% reduce 0%
17/11/14 18:43:35 INFO mapreduce.Job: Task Id : attempt_1510681534788_0007_m_000003_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.RuntimeException: java.sql.SQLRecoverableException: No more data to read from socket
        at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:167)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:749)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.lang.RuntimeException: java.sql.SQLRecoverableException: No more data to read from socket
        at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:220)
        at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:165)
        ... 9 more
Caused by: java.sql.SQLRecoverableException: No more data to read from socket
        at oracle.jdbc.driver.SQLStateMapping.newSQLException(SQLStateMapping.java:281)
        at oracle.jdbc.driver.DatabaseError.newSQLException(DatabaseError.java:118)
        at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:224)
        at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:296)
        at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:539)
        at oracle.jdbc.driver.T4CMAREngine.unmarshalUB1(T4CMAREngine.java:1091)
        at oracle.jdbc.driver.T4CMAREngine.unmarshalSB1(T4CMAREngine.java:1040)
        at oracle.jdbc.driver.T4CTTIoauthenticate.receiveOauth(T4CTTIoauthenticate.java:814)
        at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:428)
        at oracle.jdbc.driver.PhysicalConnection.&amp;lt;init&amp;gt;(PhysicalConnection.java:494)
        ... 10 more


17/11/14 18:43:55 INFO mapreduce.Job:  map 100% reduce 0%
17/11/14 18:43:55 INFO mapreduce.Job: Job job_1510681534788_0007 completed successfully
17/11/14 18:43:56 INFO mapreduce.Job: Counters: 31
        File System Counters
                FILE: Number of bytes read=0
                FILE: Number of bytes written=750200
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=579
                HDFS: Number of bytes written=136
                HDFS: Number of read operations=20
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=10
 Job Counters
                Failed map tasks=1
                Launched map tasks=6
                Other local map tasks=6
                Total time spent by all maps in occupied slots (ms)=310816
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=310816
                Total vcore-seconds taken by all map tasks=310816
                Total megabyte-seconds taken by all map tasks=77704000
        Map-Reduce Framework
                Map input records=4
                Map output records=4
                Input split bytes=579
                Spilled Records=0
                Failed Shuffles=0
                Merged Map outputs=0
                GC time elapsed (ms)=6122
                CPU time spent (ms)=8900
                Physical memory (bytes) snapshot=773640192
                Virtual memory (bytes) snapshot=4145668096
                Total committed heap usage (bytes)=665845760
        File Input Format Counters
                Bytes Read=0
        File Output Format Counters
                Bytes Written=136
17/11/14 18:43:56 INFO mapreduce.ImportJobBase: Transferred 136 bytes in 125.5593 seconds (1.0832 bytes/sec)
17/11/14 18:43:56 INFO mapreduce.ImportJobBase: Retrieved 4 records.
17/11/14 18:43:56 INFO util.AppendUtils: Appending to directory orders
17/11/14 18:43:56 INFO util.AppendUtils: Using found partition 10
17/11/14 18:43:56 INFO tool.ImportTool: Incremental import complete! To run another incremental import of all data following this import, supply the following arguments:
17/11/14 18:43:56 INFO tool.ImportTool:  --incremental lastmodified
17/11/14 18:43:56 INFO tool.ImportTool:   --check-column ORDER_TIMESTAMP
17/11/14 18:43:56 INFO tool.ImportTool:   --last-value 2017-11-14 19:29:52.0
17/11/14 18:43:56 INFO tool.ImportTool: (Consider saving this with 'sqoop job --create')

&lt;BR /&gt;&lt;/PRE&gt;&lt;P&gt;I will reach out whenever I run into an issue. You are really a champ. Thank you once again.&lt;/P&gt;</description>
      <pubDate>Tue, 14 Nov 2017 22:18:16 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/SQOOP-IMPORT-FROM-ORACLE-TIMESTAMP-ERROR-ORA-01840-input/m-p/194592#M71279</guid>
      <dc:creator>lokashis_r</dc:creator>
      <dc:date>2017-11-14T22:18:16Z</dc:date>
    </item>
  </channel>
</rss>

