New Contributor
Posts: 2
Registered: 04-06-2015
Accepted Solution

Sqoop export to oracle 11g timeout

Hi Guys,


I am trying to export a tab-separated text file (created with Pig) from HDFS to a table in Oracle 11g using Sqoop. When I submit the Sqoop job, the map task progresses from 0% to 100%, but relatively slowly (the text file is only 9 rows by 12 columns). The job then gets stuck at 100% until I get a timeout error, and then gets resubmitted.


Here is the sqoop command I used:

sqoop export --connect "jdbc:oracle:thin:@ipaddress:port:db" --username 'username' --password 'password' --table "tablename" --export-dir /user/project/subdirectory/filename --fields-terminated-by "\t" --verbose
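As a first sanity check before digging into the connection itself, it can help to confirm the export file really is uniformly tab-delimited with the expected number of columns, since ragged rows are a common cause of failed exports. A minimal sketch (the HDFS path matches the one in the command above; on a real cluster you would pipe from `hdfs dfs -cat` instead of reading a local file):

```shell
# Print the field count of each line; for this file every line should report 12.
# On the cluster: hdfs dfs -cat /user/project/subdirectory/filename | awk -F'\t' '{ print NR": "NF" fields" }'
awk -F'\t' '{ print NR": "NF" fields" }' filename
```

Any line whose count differs from 12 points at malformed input rather than a JDBC problem.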

I have tried running the same sqoop job multiple times and occasionally I get the following SQLRecoverableException.

Error: java.sql.SQLRecoverableException: No more data to read from socket
        at org.apache.sqoop.mapreduce.ExportBatchOutputFormat.getRecordWriter(
        at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(
        at org.apache.hadoop.mapred.MapTask.runNewMapper(
        at org.apache.hadoop.mapred.YarnChild$
        at Method)
        at org.apache.hadoop.mapred.YarnChild.main(
Caused by: java.sql.SQLRecoverableException: No more data to read from socket
        at oracle.jdbc.driver.T4CMAREngineStream.unmarshalUB1(
        at oracle.jdbc.driver.T4CTTIfun.receive(
        at oracle.jdbc.driver.T4CTTIfun.doRPC(
        at oracle.jdbc.driver.T4CTTIoauthenticate.doOAUTH(
        at oracle.jdbc.driver.T4CTTIoauthenticate.doOAUTH(
        at oracle.jdbc.driver.T4CConnection.logon(
        at oracle.jdbc.driver.PhysicalConnection.connect(
        at oracle.jdbc.driver.T4CDriverExtension.getConnection(
        at oracle.jdbc.driver.OracleDriver.connect(
        at java.sql.DriverManager.getConnection(
        at java.sql.DriverManager.getConnection(
        at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(
        at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.<init>(
        at org.apache.sqoop.mapreduce.ExportOutputFormat$ExportRecordWriter.<init>(
        at com.cloudera.sqoop.mapreduce.ExportOutputFormat$ExportRecordWriter.<init>(
        at org.apache.sqoop.mapreduce.ExportBatchOutputFormat$ExportBatchRecordWriter.<init>(
        at org.apache.sqoop.mapreduce.ExportBatchOutputFormat.getRecordWriter(
        ... 8 more

I have tried importing data into HDFS from the same table using Sqoop, and that works successfully. Please let me know your thoughts.


Here is some additional information:

Hadoop distribution: CDH-5.3.1-1

Sqoop version: 1.4.5-cdh5.3.1


Thank you for your time.



New Contributor
Posts: 2
Registered: 04-06-2015

Re: Sqoop export to oracle 11g timeout

[ Edited ]

I was able to solve my issue. The problem was that I was trying to insert data that violated the database's integrity constraints. I found that by going through the syslogs for the map job Sqoop created.
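For anyone hitting the same symptom: the map task's syslog (viewable in the ResourceManager web UI, or fetched with `yarn logs` on a YARN cluster) will usually contain the underlying Oracle error code. A quick way to surface it is to grep the aggregated logs for ORA- codes; the application ID below is a placeholder, not from the original post:

```shell
# Fetch the aggregated task logs for the failed Sqoop job and list any
# distinct Oracle error codes (ORA-xxxxx) they contain.
# application_0000000000000_0001 is a placeholder; substitute your job's ID.
yarn logs -applicationId application_0000000000000_0001 2>/dev/null \
  | grep -oE 'ORA-[0-9]{5}' | sort -u
```

A constraint violation like the one here would typically show up as ORA-00001 (unique constraint) or ORA-02291 (foreign key), which is much more informative than the generic "No more data to read from socket" seen by the client.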


Posts: 416
Topics: 51
Kudos: 89
Solutions: 49
Registered: 06-26-2013

Re: Sqoop export to oracle 11g timeout

Thank you for closing the loop with us.