<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Inserting From external Data Table to Hive Table in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Inserting-From-external-Data-Table-to-Hive-Table/m-p/150630#M48728</link>
    <description>&lt;P&gt;Thanks for your brilliant, detailed answer.&lt;/P&gt;</description>
    <pubDate>Wed, 14 Dec 2016 17:49:17 GMT</pubDate>
    <dc:creator>oula_alshiekh</dc:creator>
    <dc:date>2016-12-14T17:49:17Z</dc:date>
    <item>
      <title>Inserting From external Data Table to Hive Table</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Inserting-From-external-Data-Table-to-Hive-Table/m-p/150626#M48724</link>
      <description>&lt;P&gt;We are using PolyBase (a SQL Server 2016 feature) with an external table mapped to a Hive table, so that inserting data into the external table pushes the data into the associated Hive table.
My question: is there a limit on the maximum number of records that can be inserted into an external table?
When I insert data into the external table from another SQL Server table that has more than 30,000 records, I encounter this error:

&lt;/P&gt;&lt;P&gt;Cannot execute the query "Remote Query" against OLE DB provider 
"SQLNCLI11" for linked server "SQLNCLI11". 110802;An internal DMS error 
occurred that caused this operation to fail. Details: Exception: 
Microsoft.SqlServer.DataWarehouse.DataMovement.Common.ExternalAccess.HdfsAccessException,
 Message: Java exception raised on call to 
HdfsBridge_DestroyRecordWriter: Error [0
    at 
org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.getDatanodeStorageInfos(DatanodeManager.java:513)
    at 
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.updatePipelineInternal(FSNamesystem.java:6379)
    at 
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.updatePipeline(FSNamesystem.java:6344)
    at 
org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.updatePipeline(NameNodeRpcServer.java:822)
    at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.updatePipeline(ClientNamenodeProtocolServerSideTranslatorPB.java:971)
    at 
org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)
] occurred while accessing external file.&lt;/P&gt;

&lt;P&gt;Inserting fewer than 30,000 records works fine and the data is inserted into Hive.
Could this error be caused by one of the following?&lt;/P&gt;

&lt;P&gt;1- A limit on the number of records that can be inserted into an external table&lt;/P&gt;

&lt;P&gt;2- A limit in the PolyBase configuration&lt;/P&gt;

&lt;P&gt;3- Some other problem in Hive&lt;/P&gt;</description>
      <pubDate>Tue, 13 Dec 2016 22:09:35 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Inserting-From-external-Data-Table-to-Hive-Table/m-p/150626#M48724</guid>
      <dc:creator>oula_alshiekh</dc:creator>
      <dc:date>2016-12-13T22:09:35Z</dc:date>
    </item>
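    <!-- Editor's sketch: the PolyBase setup described in the question above, expressed as T-SQL. All object names (HiveHadoop, SalesFile, dbo.ExtSales, dbo.LocalSales) and the HDFS paths are hypothetical, not taken from the original post. -->

```sql
-- Sketch of the PolyBase flow described above (SQL Server 2016).
-- Every name and location here is illustrative only.

-- 1. External data source pointing at the Hadoop cluster backing Hive.
CREATE EXTERNAL DATA SOURCE HiveHadoop
WITH (TYPE = HADOOP, LOCATION = 'hdfs://namenode:8020');

-- 2. File format matching the Hive table's storage (delimited text here).
CREATE EXTERNAL FILE FORMAT SalesFile
WITH (FORMAT_TYPE = DELIMITEDTEXT,
      FORMAT_OPTIONS (FIELD_TERMINATOR = ','));

-- 3. External table over the HDFS directory the Hive table reads from.
CREATE EXTERNAL TABLE dbo.ExtSales (
    id     INT,
    amount DECIMAL(10, 2)
)
WITH (LOCATION = '/apps/hive/warehouse/sales',
      DATA_SOURCE = HiveHadoop,
      FILE_FORMAT = SalesFile);

-- 4. The failing step: pushing rows from a local SQL Server table to HDFS.
-- PolyBase export must be enabled first:
--   EXEC sp_configure 'allow polybase export', 1; RECONFIGURE;
INSERT INTO dbo.ExtSales
SELECT id, amount
FROM dbo.LocalSales;  -- the post reports failures above ~30,000 rows
```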
    <item>
      <title>Re: Inserting From external Data Table to Hive Table</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Inserting-From-external-Data-Table-to-Hive-Table/m-p/150627#M48725</link>
      <description>&lt;A rel="user" href="https://community.cloudera.com/users/12065/oulaalshiekh.html" nodeid="12065"&gt;@oula.alshiekh@gmail.com alshiekh&lt;/A&gt;&lt;P&gt;When you say that you are able to insert less than 30K records, does that mean the same source and destination? The reason I ask is because your error points towards a permission/access issue.&lt;/P&gt;</description>
      <pubDate>Tue, 13 Dec 2016 23:32:52 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Inserting-From-external-Data-Table-to-Hive-Table/m-p/150627#M48725</guid>
      <dc:creator>mqureshi</dc:creator>
      <dc:date>2016-12-13T23:32:52Z</dc:date>
    </item>
    <item>
      <title>Re: Inserting From external Data Table to Hive Table</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Inserting-From-external-Data-Table-to-Hive-Table/m-p/150628#M48726</link>
      <description>&lt;P&gt;This looks like a PolyBase configuration issue.&lt;/P&gt;&lt;P&gt;&lt;A href="https://msdn.microsoft.com/en-us/library/dn935026.aspx" target="_blank"&gt;https://msdn.microsoft.com/en-us/library/dn935026.aspx&lt;/A&gt;&lt;/P&gt;&lt;P&gt;The maximum number of concurrent PolyBase queries is 32. When 32 concurrent queries are running, each query can read a maximum of 33,000 files from the external file location. The root folder and each subfolder also count as a file. If the degree of concurrency is less than 32, the external file location can contain more than 33,000 files.&lt;/P&gt;&lt;P&gt;Are you accessing a single file or multiple files? What is the Hive DDL?&lt;/P&gt;&lt;P&gt;I would recommend Hive with ORC format and fewer files.&lt;/P&gt;&lt;P&gt;See:&lt;/P&gt;&lt;P&gt;&lt;A href="https://sqlwithmanoj.com/2016/06/09/polybase-error-in-sql-server-2016-row-size-exceeds-the-defined-maximum-dms-row-size-larger-than-the-limit-of-32768-bytes/" target="_blank"&gt;https://sqlwithmanoj.com/2016/06/09/polybase-error-in-sql-server-2016-row-size-exceeds-the-defined-maximum-dms-row-size-larger-than-the-limit-of-32768-bytes/&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&lt;A href="https://docs.microsoft.com/en-us/azure/sql-data-warehouse/sql-data-warehouse-service-capacity-limits" target="_blank"&gt;https://docs.microsoft.com/en-us/azure/sql-data-warehouse/sql-data-warehouse-service-capacity-limits&lt;/A&gt;&lt;/P&gt;&lt;H2&gt;Loads&lt;/H2&gt;&lt;TABLE&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TH&gt;Category&lt;/TH&gt;&lt;TH&gt;Description&lt;/TH&gt;&lt;TH&gt;Maximum&lt;/TH&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD&gt;PolyBase loads&lt;/TD&gt;&lt;TD&gt;Bytes per row&lt;/TD&gt;&lt;TD&gt;32,768 &amp;mdash; PolyBase loads are limited to rows smaller than 32 KB and cannot load into VARCHAR(MAX), NVARCHAR(MAX), or VARBINARY(MAX) columns. While this limit exists today, it will be removed fairly soon.&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;See: &lt;A href="https://blogs.msdn.microsoft.com/sqlcat/2016/06/21/polybase-setup-errors-and-possible-solutions/" target="_blank"&gt;https://blogs.msdn.microsoft.com/sqlcat/2016/06/21/polybase-setup-errors-and-possible-solutions/&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Check for errors on the Hadoop server.&lt;/P&gt;&lt;P&gt;Which version of HDP or HDInsight? Which Hive version? Are there any memory issues?&lt;/P&gt;&lt;P&gt;Check the logs and Ambari.&lt;/P&gt;</description>
      <pubDate>Tue, 13 Dec 2016 23:34:48 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Inserting-From-external-Data-Table-to-Hive-Table/m-p/150628#M48726</guid>
      <dc:creator>TimothySpann</dc:creator>
      <dc:date>2016-12-13T23:34:48Z</dc:date>
    </item>
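    <!-- Editor's sketch: the "Hive with ORC format and fewer files" recommendation above, as Hive DDL. The table and column names (sales_orc, sales_text, id, amount) are hypothetical. -->

```sql
-- Hypothetical Hive DDL following the ORC recommendation above.
-- Fewer, larger ORC files generally behave better than many small text files.
CREATE TABLE sales_orc (
    id     INT,
    amount DECIMAL(10, 2)
)
STORED AS ORC
TBLPROPERTIES ('orc.compress' = 'SNAPPY');

-- Compact an existing text-backed table into the ORC table:
-- INSERT OVERWRITE TABLE sales_orc SELECT id, amount FROM sales_text;
```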
    <item>
      <title>Re: Inserting From external Data Table to Hive Table</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Inserting-From-external-Data-Table-to-Hive-Table/m-p/150629#M48727</link>
      <description>&lt;P&gt;The source is a SQL Server table; the destination is a Hive table.&lt;/P&gt;&lt;P&gt;I haven't configured any permissions in Hadoop yet.&lt;/P&gt;&lt;P&gt;So my problem is because of the PolyBase limit on inserted rows.&lt;/P&gt;&lt;P&gt;Thank you.&lt;/P&gt;</description>
      <pubDate>Wed, 14 Dec 2016 17:48:34 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Inserting-From-external-Data-Table-to-Hive-Table/m-p/150629#M48727</guid>
      <dc:creator>oula_alshiekh</dc:creator>
      <dc:date>2016-12-14T17:48:34Z</dc:date>
    </item>
    <item>
      <title>Re: Inserting From external Data Table to Hive Table</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Inserting-From-external-Data-Table-to-Hive-Table/m-p/150630#M48728</link>
      <description>&lt;P&gt;Thanks for your brilliant, detailed answer.&lt;/P&gt;</description>
      <pubDate>Wed, 14 Dec 2016 17:49:17 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Inserting-From-external-Data-Table-to-Hive-Table/m-p/150630#M48728</guid>
      <dc:creator>oula_alshiekh</dc:creator>
      <dc:date>2016-12-14T17:49:17Z</dc:date>
    </item>
    <item>
      <title>Re: Inserting From external Data Table to Hive Table</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Inserting-From-external-Data-Table-to-Hive-Table/m-p/150631#M48729</link>
      <description>&lt;P&gt;Some people confuse rather than help. This is a straight, to-the-point answer. Congratulations and thank you.&lt;/P&gt;</description>
      <pubDate>Sat, 07 Jan 2017 03:15:35 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Inserting-From-external-Data-Table-to-Hive-Table/m-p/150631#M48729</guid>
      <dc:creator>jose_agc</dc:creator>
      <dc:date>2017-01-07T03:15:35Z</dc:date>
    </item>
  </channel>
</rss>

