Member since
07-05-2016
4
Posts
1
Kudos Received
0
Solutions
07-06-2016
06:25 AM
Thanks for the reply. I can do that in my use case. Let me try!
07-06-2016
06:14 AM
1 Kudo
I ran into an issue when I tried to merge two big tables from SQL Server for a Sqoop import. I'm getting a SQL exception saying the number of rows reached its maximum limit. Please see the error message below for details, and share your thoughts/suggestions if you have faced this before. Error: java.io.IOException: Connection handler cannot recover failure:
at org.apache.sqoop.mapreduce.db.SQLServerDBRecordReader.nextKeyValue(SQLServerDBRecordReader.java:169)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:553)
at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.io.IOException: SQLException in nextKeyValue
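For context, the merge was done with a free-form query import. This is a sketch of the shape of the command (the host, database, table, and column names here are placeholders, not the actual ones), using Sqoop's documented --query form with the required $CONDITIONS token and an explicit --split-by column:

```shell
# Illustrative Sqoop free-form query import joining two SQL Server tables.
# Host, database, tables, and columns are placeholder names.
sqoop import \
  --connect "jdbc:sqlserver://dbhost:1433;databaseName=mydb" \
  --username sqoop_user -P \
  --query 'SELECT a.id, a.col1, b.col2
           FROM tableA a JOIN tableB b ON a.id = b.id
           WHERE $CONDITIONS' \
  --split-by a.id \
  --target-dir /user/sqoop/merged_tables \
  --num-mappers 4
```

With --query, Sqoop requires the literal $CONDITIONS placeholder in the WHERE clause (it substitutes per-mapper split predicates there) and a --split-by column when more than one mapper is used.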
Labels:
- Apache Hadoop
- Apache Sqoop