<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question merge two big tables from SQLSERVER for Sqoop Import... in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/merge-two-big-tables-from-SQLSERVER-for-Sqoop-Import/m-p/115100#M33960</link>
    <description>&lt;P&gt;I ran into an issue when trying to merge two big tables from SQL Server for a Sqoop import.  I&#8217;m getting a SQL exception saying the row has reached its maximum size limit.  Please see the error message below for more details, and please share your thoughts/suggestions if you have faced this before.&lt;/P&gt;&lt;P&gt;Error: java.io.IOException: Connection handler cannot recover failure:
 at org.apache.sqoop.mapreduce.db.SQLServerDBRecordReader.nextKeyValue(SQLServerDBRecordReader.java:169)
  at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:553)
  at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
  at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
  at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
  at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:415)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
  at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.io.IOException: SQLException in nextKeyValue
&lt;/P&gt;</description>
    <pubDate>Wed, 06 Jul 2016 13:14:31 GMT</pubDate>
    <dc:creator>prashantkotkar2</dc:creator>
    <dc:date>2016-07-06T13:14:31Z</dc:date>
    <item>
      <title>merge two big tables from SQLSERVER for Sqoop Import...</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/merge-two-big-tables-from-SQLSERVER-for-Sqoop-Import/m-p/115100#M33960</link>
      <description>&lt;P&gt;I ran into an issue when trying to merge two big tables from SQL Server for a Sqoop import.  I&#8217;m getting a SQL exception saying the row has reached its maximum size limit.  Please see the error message below for more details, and please share your thoughts/suggestions if you have faced this before.&lt;/P&gt;&lt;P&gt;Error: java.io.IOException: Connection handler cannot recover failure:
 at org.apache.sqoop.mapreduce.db.SQLServerDBRecordReader.nextKeyValue(SQLServerDBRecordReader.java:169)
  at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:553)
  at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
  at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
  at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
  at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:415)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
  at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.io.IOException: SQLException in nextKeyValue
&lt;/P&gt;</description>
      <pubDate>Wed, 06 Jul 2016 13:14:31 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/merge-two-big-tables-from-SQLSERVER-for-Sqoop-Import/m-p/115100#M33960</guid>
      <dc:creator>prashantkotkar2</dc:creator>
      <dc:date>2016-07-06T13:14:31Z</dc:date>
    </item>
    <item>
      <title>Re: merge two big tables from SQLSERVER for Sqoop Import...</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/merge-two-big-tables-from-SQLSERVER-for-Sqoop-Import/m-p/115101#M33961</link>
      <description>&lt;P&gt;SQL Server’s row-size limit applies even when you join two tables and select a single wide combined row.&lt;/P&gt;&lt;P&gt;An alternative solution, if you don’t want to change the row size, is to import the SQL Server tables into Hadoop as individual external tables, then join them on the Hadoop side and populate the actual target table.&lt;/P&gt;</description>
      <pubDate>Wed, 06 Jul 2016 13:19:29 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/merge-two-big-tables-from-SQLSERVER-for-Sqoop-Import/m-p/115101#M33961</guid>
      <dc:creator>mkumar13</dc:creator>
      <dc:date>2016-07-06T13:19:29Z</dc:date>
    </item>
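The two-step workaround described in the reply above can be sketched as a shell script. This is a hedged illustration only: the host, database, credential, table, and directory names (dbhost, salesdb, etl_user, orders, customers, /staging/...) are hypothetical placeholders, and the Hive DDL assumes the imported files are comma-delimited text with the column layouts shown.

```shell
# Step 1: import each SQL Server table separately (no server-side join),
# so no single result row hits SQL Server's row-size limit.
# All connection details and table names here are placeholders.
sqoop import \
  --connect "jdbc:sqlserver://dbhost:1433;databaseName=salesdb" \
  --username etl_user --password-file /user/etl/.password \
  --table orders \
  --target-dir /staging/orders \
  --num-mappers 4

sqoop import \
  --connect "jdbc:sqlserver://dbhost:1433;databaseName=salesdb" \
  --username etl_user --password-file /user/etl/.password \
  --table customers \
  --target-dir /staging/customers \
  --num-mappers 4

# Step 2: expose the imported files as Hive external tables, then join
# them on the Hadoop side to populate the final (managed) table.
hive -e "
CREATE EXTERNAL TABLE orders_ext (order_id INT, customer_id INT, amount DOUBLE)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
  LOCATION '/staging/orders';
CREATE EXTERNAL TABLE customers_ext (customer_id INT, name STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
  LOCATION '/staging/customers';
CREATE TABLE orders_enriched AS
  SELECT o.order_id, c.name, o.amount
  FROM orders_ext o JOIN customers_ext c ON o.customer_id = c.customer_id;
"
```

Because the join runs in Hadoop rather than in SQL Server, the wide combined row is only ever materialized on the Hadoop side, where SQL Server's row-size restriction does not apply.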
    <item>
      <title>Re: merge two big tables from SQLSERVER for Sqoop Import...</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/merge-two-big-tables-from-SQLSERVER-for-Sqoop-Import/m-p/115102#M33962</link>
      <description>&lt;P&gt;Thanks for the reply. I can do that in my use case; let me try!&lt;/P&gt;</description>
      <pubDate>Wed, 06 Jul 2016 13:25:26 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/merge-two-big-tables-from-SQLSERVER-for-Sqoop-Import/m-p/115102#M33962</guid>
      <dc:creator>prashantkotkar2</dc:creator>
      <dc:date>2016-07-06T13:25:26Z</dc:date>
    </item>
  </channel>
</rss>

