
Sqoop error "table does not contain any data", but the table is populated with data


Hi All,

A week ago, my Sqoop jobs stopped working for no apparent reason. They stopped pulling data, and when I ran one of the jobs manually to debug it, I got the following result:


16/08/25 18:11:31 INFO mapreduce.ImportJobBase: Transferred 48 bytes in 26.6693 seconds (1.7998 bytes/sec)
16/08/25 18:11:31 INFO mapreduce.ImportJobBase: Retrieved 0 records.


The source tables are still populated with data. When I created a new table with data in it, to see whether I could import that one, I got the following error:

 

16/08/25 12:25:23 FATAL oracle.OraOopDataDrivenDBInputFormat: The table GV_OPEN_CURSOR_TEST does not contain any data.
16/08/25 12:25:23 INFO mapreduce.JobSubmitter: Cleaning up the staging area /user/mrice/.staging/job_1471956835422_0035
16/08/25 12:25:23 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.RuntimeException: The table GV_OPEN_CURSOR_TEST does not contain any data.
java.lang.RuntimeException: The table GV_OPEN_CURSOR_TEST does not contain any data.
    at org.apache.sqoop.manager.oracle.OraOopDataDrivenDBInputFormat.getSplits(OraOopDataDrivenDBInputFormat.java:108)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:305)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:322)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:200)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1307)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1304)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1707)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1304)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1325)
    at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196)
    at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
    at org.apache.sqoop.manager.oracle.OraOopConnManager.importTable(OraOopConnManager.java:284)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:507)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
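
As a sanity check (a sketch on my part, not output from the failing run), a sqoop eval against the same connection should confirm whether Sqoop itself can see rows in the test table:

# Sanity check: count rows through the same JDBC connection the import uses.
# I would expect a non-zero count here, since the table is populated.
sqoop eval \
  --connect jdbc:oracle:thin:@server1:1521/serverprod \
  --username HDP \
  --password-file /user/mrice/audpw.txt \
  --query "SELECT COUNT(*) FROM GV_OPEN_CURSOR_TEST"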


Does anybody have experience with this? Below is my Sqoop command:


sqoop import \
  --connect jdbc:oracle:thin:@server1:1521/serverprod \
  --username HDP \
  --password-file /user/mrice/audpw.txt \
  --table OPEN_CURSOR \
  --target-dir /data/audit/tempfiles/OPEN_CURSOR \
  --fields-terminated-by '\t' \
  --num-mappers 20 \
  --compression-codec org.apache.hadoop.io.compress.SnappyCodec \
  --delete-target-dir \
  --compress \
  --direct
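
From the stack trace, the fatal error is raised by OraOopDataDrivenDBInputFormat, which as far as I understand belongs to the OraOop direct connector that --direct enables (my assumption, not confirmed). If that connector is the culprit, the same import without --direct should fall back to the generic Oracle JDBC code path; a sketch of that variant:

# Same import minus --direct (sketch): uses the generic JDBC manager
# instead of the OraOop connector that throws the error above.
sqoop import \
  --connect jdbc:oracle:thin:@server1:1521/serverprod \
  --username HDP \
  --password-file /user/mrice/audpw.txt \
  --table OPEN_CURSOR \
  --target-dir /data/audit/tempfiles/OPEN_CURSOR \
  --fields-terminated-by '\t' \
  --num-mappers 20 \
  --compression-codec org.apache.hadoop.io.compress.SnappyCodec \
  --delete-target-dir \
  --compress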


Thanks
