Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.

Sqoop export to oracle 11g timeout

New Contributor

 Hi Guys,

 

I am trying to export a tab-separated text file, which I created using Pig, from HDFS to a table in Oracle 11g using Sqoop. When I submit the Sqoop job, the map task progresses from 0% to 100%, but relatively slowly (the text file has only 9 rows and 12 columns). The job then gets stuck at 100% until I get a timeout error, after which it is resubmitted.

 

Here is the sqoop command I used:

sqoop export --connect "jdbc:oracle:thin:@ipaddress:port:db" --username 'username' --password 'password' --table "tablename" --export-dir /user/project/subdirectory/filename --fields-terminated-by "\t" --verbose
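
Not from this thread, but a variant sometimes worth trying when a tiny export hangs: forcing a single map task and JDBC batching. `-m` and `--batch` are standard Sqoop 1.4 export flags; the connection details below are the same placeholders as in the command above.

```shell
# Sketch only: placeholder connection details, same as the original command.
# -m 1 runs a single map task (plenty for a 9-row file); --batch groups
# INSERT statements into JDBC batches instead of executing them one at a time.
sqoop export \
  --connect "jdbc:oracle:thin:@ipaddress:port:db" \
  --username 'username' --password 'password' \
  --table "tablename" \
  --export-dir /user/project/subdirectory/filename \
  --fields-terminated-by "\t" \
  -m 1 --batch --verbose
```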

I have tried running the same Sqoop job multiple times, and occasionally I get the following SQLRecoverableException:

Error: java.io.IOException: java.sql.SQLRecoverableException: No more data to read from socket
        at org.apache.sqoop.mapreduce.ExportBatchOutputFormat.getRecordWriter(ExportBatchOutputFormat.java:50)
        at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:644)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.sql.SQLRecoverableException: No more data to read from socket
        at oracle.jdbc.driver.T4CMAREngineStream.unmarshalUB1(T4CMAREngineStream.java:456)
        at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:397)
        at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:257)
        at oracle.jdbc.driver.T4CTTIoauthenticate.doOAUTH(T4CTTIoauthenticate.java:437)
        at oracle.jdbc.driver.T4CTTIoauthenticate.doOAUTH(T4CTTIoauthenticate.java:954)
        at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:639)
        at oracle.jdbc.driver.PhysicalConnection.connect(PhysicalConnection.java:666)
        at oracle.jdbc.driver.T4CDriverExtension.getConnection(T4CDriverExtension.java:32)
        at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:566)
        at java.sql.DriverManager.getConnection(DriverManager.java:571)
        at java.sql.DriverManager.getConnection(DriverManager.java:215)
        at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:302)
        at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.<init>(AsyncSqlRecordWriter.java:76)
        at org.apache.sqoop.mapreduce.ExportOutputFormat$ExportRecordWriter.<init>(ExportOutputFormat.java:95)
        at com.cloudera.sqoop.mapreduce.ExportOutputFormat$ExportRecordWriter.<init>(ExportOutputFormat.java:38)
        at org.apache.sqoop.mapreduce.ExportBatchOutputFormat$ExportBatchRecordWriter.<init>(ExportBatchOutputFormat.java:63)
        at org.apache.sqoop.mapreduce.ExportBatchOutputFormat.getRecordWriter(ExportBatchOutputFormat.java:48)
        ... 8 more

I have tried importing data into HDFS from the same table using Sqoop, and that works successfully. Please let me know your thoughts.

 

Here is some additional information:

Hadoop distribution: CDH-5.3.1-1

Sqoop version: 1.4.5-cdh5.3.1

 

Thank you for your time.
1 ACCEPTED SOLUTION

New Contributor

I was able to solve my issue. Apparently, the problem was that I was trying to insert data that violated the database's integrity constraints. I found this by going through the syslogs of the map job Sqoop created.
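
For anyone landing here with the same symptom: the aggregated task logs can be pulled with the `yarn logs` CLI. The application ID below is a hypothetical placeholder; take the real one from the Sqoop console output ("Submitted application application_...") or the ResourceManager UI.

```shell
# Hypothetical application ID -- substitute the one printed by your Sqoop job.
APP_ID="application_1423456789000_0001"

# Dump the aggregated logs and look for Oracle error codes; constraint
# violations surface as ORA- lines in the mapper's syslog section
# (e.g. ORA-00001 unique constraint violated, ORA-02291 parent key not found).
yarn logs -applicationId "$APP_ID" | grep -i "ORA-"
```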

 


2 REPLIES

Guru

Thank you for closing the loop with us.