<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: ConsumeKafka to PuthiveQL can't send data in NiFi Directly in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/ConsumeKafka-to-PuthiveQL-can-t-send-data-in-NiFi-Directly/m-p/223583#M82388</link>
    <description>&lt;A rel="user" href="https://community.cloudera.com/users/91299/hariprasanthmadhavan.html" nodeid="91299"&gt;@Hariprasanth Madhavan&lt;/A&gt;&lt;P&gt;&lt;STRONG&gt;What the PutHiveQL processor does:&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;It executes a HiveQL DDL/DML command (e.g., UPDATE, INSERT). The content of an incoming FlowFile is expected to be the HiveQL command to execute.&lt;/P&gt;&lt;P&gt;-&lt;/P&gt;&lt;P&gt;If you want to insert data into a Hive table directly, use the &lt;STRONG&gt;PutHiveStreaming&lt;/STRONG&gt; processor instead of &lt;STRONG&gt;PutHiveQL&lt;/STRONG&gt;.&lt;/P&gt;&lt;P&gt;The &lt;STRONG&gt;PutHiveStreaming&lt;/STRONG&gt; processor expects incoming data in Avro format, and the target table needs to have &lt;STRONG&gt;transactions enabled&lt;/STRONG&gt;. Depending on the format of the data coming out of ConsumeKafka, use a ConvertRecord processor to convert the source data into Avro, then feed the Avro data into PutHiveStreaming.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Flow:&lt;/STRONG&gt;&lt;/P&gt;&lt;PRE&gt;1.ConsumeKafka
2.ConvertRecord //convert the outgoing FlowFile into Avro format
3.PutHiveStreaming&lt;/PRE&gt;&lt;P&gt;Refer to &lt;A href="https://cwiki.apache.org/confluence/display/Hive/Hive+Transactions" target="_blank"&gt;this&lt;/A&gt; link for Hive transactional tables and &lt;A href="https://community.hortonworks.com/articles/115311/convert-csv-to-json-avro-xml-using-convertrecord-p.html" target="_blank"&gt;this&lt;/A&gt; link for ConvertRecord processor usage.&lt;/P&gt;&lt;P&gt;-&lt;/P&gt;&lt;P&gt;If this answer helped resolve your issue, &lt;STRONG&gt;click the Accept button below to accept it&lt;/STRONG&gt;; that helps Community users find solutions to these kinds of issues quickly.&lt;/P&gt;</description>
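    <!--
      A minimal sketch of the kind of statement PutHiveQL expects as the content of an
      incoming FlowFile, as described in the answer above. The table and column names
      here are hypothetical, not taken from the thread.

        INSERT INTO kafka_events (event_id, payload)
        VALUES (1, 'hello from Kafka');
    -->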
    <pubDate>Tue, 21 Aug 2018 02:49:30 GMT</pubDate>
    <dc:creator>Shu_ashu</dc:creator>
    <dc:date>2018-08-21T02:49:30Z</dc:date>
    <item>
      <title>ConsumeKafka to PuthiveQL can't send data in NiFi Directly</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/ConsumeKafka-to-PuthiveQL-can-t-send-data-in-NiFi-Directly/m-p/223582#M82387</link>
      <description>&lt;P&gt;HDP 2.6.3.0, NiFi 1.6.0&lt;/P&gt;&lt;P style="margin-left: 40px;"&gt;How do I insert data into a Hive table from a Kafka consumer?&lt;/P&gt;&lt;P style="margin-left: 40px;"&gt;KafkaConsumer -----&amp;gt; PutHiveQL&lt;/P&gt;</description>
      <pubDate>Mon, 20 Aug 2018 11:41:27 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/ConsumeKafka-to-PuthiveQL-can-t-send-data-in-NiFi-Directly/m-p/223582#M82387</guid>
      <dc:creator>hariprasanthmad</dc:creator>
      <dc:date>2018-08-20T11:41:27Z</dc:date>
    </item>
    <item>
      <title>Re: ConsumeKafka to PuthiveQL can't send data in NiFi Directly</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/ConsumeKafka-to-PuthiveQL-can-t-send-data-in-NiFi-Directly/m-p/223583#M82388</link>
      <description>&lt;A rel="user" href="https://community.cloudera.com/users/91299/hariprasanthmadhavan.html" nodeid="91299"&gt;@Hariprasanth Madhavan&lt;/A&gt;&lt;P&gt;&lt;STRONG&gt;What the PutHiveQL processor does:&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;It executes a HiveQL DDL/DML command (e.g., UPDATE, INSERT). The content of an incoming FlowFile is expected to be the HiveQL command to execute.&lt;/P&gt;&lt;P&gt;-&lt;/P&gt;&lt;P&gt;If you want to insert data into a Hive table directly, use the &lt;STRONG&gt;PutHiveStreaming&lt;/STRONG&gt; processor instead of &lt;STRONG&gt;PutHiveQL&lt;/STRONG&gt;.&lt;/P&gt;&lt;P&gt;The &lt;STRONG&gt;PutHiveStreaming&lt;/STRONG&gt; processor expects incoming data in Avro format, and the target table needs to have &lt;STRONG&gt;transactions enabled&lt;/STRONG&gt;. Depending on the format of the data coming out of ConsumeKafka, use a ConvertRecord processor to convert the source data into Avro, then feed the Avro data into PutHiveStreaming.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Flow:&lt;/STRONG&gt;&lt;/P&gt;&lt;PRE&gt;1.ConsumeKafka
2.ConvertRecord //convert the outgoing FlowFile into Avro format
3.PutHiveStreaming&lt;/PRE&gt;&lt;P&gt;Refer to &lt;A href="https://cwiki.apache.org/confluence/display/Hive/Hive+Transactions" target="_blank"&gt;this&lt;/A&gt; link for Hive transactional tables and &lt;A href="https://community.hortonworks.com/articles/115311/convert-csv-to-json-avro-xml-using-convertrecord-p.html" target="_blank"&gt;this&lt;/A&gt; link for ConvertRecord processor usage.&lt;/P&gt;&lt;P&gt;-&lt;/P&gt;&lt;P&gt;If this answer helped resolve your issue, &lt;STRONG&gt;click the Accept button below to accept it&lt;/STRONG&gt;; that helps Community users find solutions to these kinds of issues quickly.&lt;/P&gt;</description>
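      <!--
        A minimal sketch of the transactional (ACID) Hive table that PutHiveStreaming
        requires, per the Hive Transactions link in the answer above: ORC storage,
        bucketed, with transactional=true. The table and column names are hypothetical.

          CREATE TABLE kafka_events (event_id INT, payload STRING)
          CLUSTERED BY (event_id) INTO 4 BUCKETS
          STORED AS ORC
          TBLPROPERTIES ('transactional' = 'true');
      -->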
      <pubDate>Tue, 21 Aug 2018 02:49:30 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/ConsumeKafka-to-PuthiveQL-can-t-send-data-in-NiFi-Directly/m-p/223583#M82388</guid>
      <dc:creator>Shu_ashu</dc:creator>
      <dc:date>2018-08-21T02:49:30Z</dc:date>
    </item>
  </channel>
</rss>

