<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Question: Avro schema change using ConvertRecord processor in NiFi in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Avro-Schema-change-using-COnvertRecord-processor-in-Nifi/m-p/188601#M80399</link>
    <description>&lt;P&gt;I am using ExecuteSQL to extract data from Oracle. While retrieving the records, the NUMBER data type gets converted to BYTES. To keep the numbers as long/double, I am using the "ConvertRecord" processor.&lt;/P&gt;&lt;P&gt;In the "ConvertRecord" processor, I chose an AvroReader using the embedded schema, and for the AvroRecordSetWriter I used an external schema to change the bytes data types to long/int. But once the processor executes, the schema is lost and I am not able to convert to ORC in order to create a Hive table. Is this some kind of bug?&lt;/P&gt;</description>
    <pubDate>Sun, 08 Jul 2018 12:43:43 GMT</pubDate>
    <dc:creator>ishu12</dc:creator>
    <dc:date>2018-07-08T12:43:43Z</dc:date>
    <item>
      <title>Avro schema change using ConvertRecord processor in NiFi</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Avro-Schema-change-using-COnvertRecord-processor-in-Nifi/m-p/188601#M80399</link>
      <description>&lt;P&gt;I am using ExecuteSQL to extract data from Oracle. While retrieving the records, the NUMBER data type gets converted to BYTES. To keep the numbers as long/double, I am using the "ConvertRecord" processor.&lt;/P&gt;&lt;P&gt;In the "ConvertRecord" processor, I chose an AvroReader using the embedded schema, and for the AvroRecordSetWriter I used an external schema to change the bytes data types to long/int. But once the processor executes, the schema is lost and I am not able to convert to ORC in order to create a Hive table. Is this some kind of bug?&lt;/P&gt;</description>
      <pubDate>Sun, 08 Jul 2018 12:43:43 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Avro-Schema-change-using-COnvertRecord-processor-in-Nifi/m-p/188601#M80399</guid>
      <dc:creator>ishu12</dc:creator>
      <dc:date>2018-07-08T12:43:43Z</dc:date>
    </item>
    <item>
      <title>Re: Avro schema change using ConvertRecord processor in NiFi</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Avro-Schema-change-using-COnvertRecord-processor-in-Nifi/m-p/188602#M80400</link>
      <description>&lt;A rel="user" href="https://community.cloudera.com/users/69213/ishan1701.html" nodeid="69213"&gt;@Ishan Kumar&lt;/A&gt;&lt;P&gt;In &lt;STRONG&gt;AvroRecordSetWriter &lt;/STRONG&gt;controller service you need to select &lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Schema Write Strategy &lt;/STRONG&gt;property value&lt;/P&gt;&lt;PRE&gt;Embed Avro Schema&lt;/PRE&gt;&lt;P&gt;So that you are writing the new schema embed to the avro data file, when you use &lt;STRONG&gt;ConvertAVROToOrc&lt;/STRONG&gt; processor there will be no issues when the schema was embedded. We are going to get issues &lt;STRONG&gt;java.io.IOException: Not a data file &lt;/STRONG&gt;only when the processors are not able to find any schema in the avro data file.&lt;/P&gt;</description>
      <pubDate>Sun, 08 Jul 2018 16:52:16 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Avro-Schema-change-using-COnvertRecord-processor-in-Nifi/m-p/188602#M80400</guid>
      <dc:creator>Shu_ashu</dc:creator>
      <dc:date>2018-07-08T16:52:16Z</dc:date>
    </item>
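<!--
The question and answer above concern Oracle NUMBER columns arriving as Avro "bytes". A minimal sketch of why those raw bytes can still be recovered as a long: the Avro decimal logical type stores the unscaled value as big-endian two's-complement bytes. The function name and the sample value below are illustrative assumptions, not taken from the thread.

```python
def decimal_bytes_to_long(raw: bytes, scale: int = 0) -> int:
    # Avro "decimal" stores the unscaled value as big-endian
    # two's-complement bytes; divide by 10**scale to recover the number.
    unscaled = int.from_bytes(raw, byteorder="big", signed=True)
    return unscaled // (10 ** scale)

# A NUMBER value of 1234 with scale 0 serializes as b'\x04\xd2'
print(decimal_bytes_to_long(b"\x04\xd2"))  # 1234
```

This is, conceptually, what a record writer does when the external schema declares the field as long instead of bytes.
-->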
    <item>
      <title>Re: Avro schema change using ConvertRecord processor in NiFi</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Avro-Schema-change-using-COnvertRecord-processor-in-Nifi/m-p/188603#M80401</link>
      <description>&lt;P&gt;Thanks Shu for the response. Here I need to provide my own schema: I created the schema in the Schema Registry and am trying to apply it. My intention is to convert the data type (from binary to long).&lt;/P&gt;</description>
      <pubDate>Mon, 09 Jul 2018 02:42:22 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Avro-Schema-change-using-COnvertRecord-processor-in-Nifi/m-p/188603#M80401</guid>
      <dc:creator>ishu12</dc:creator>
      <dc:date>2018-07-09T02:42:22Z</dc:date>
    </item>
    <item>
      <title>Re: Avro schema change using ConvertRecord processor in NiFi</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Avro-Schema-change-using-COnvertRecord-processor-in-Nifi/m-p/188604#M80402</link>
      <description>&lt;A rel="user" href="https://community.cloudera.com/users/69213/ishan1701.html" nodeid="69213" target="_blank"&gt;@Ishan Kumar&lt;/A&gt;&lt;P&gt;Schema Write Strategy is used to define schema i.e. do we need to add a schema.name attribute (or) Embed Avro Schema(this  is newly defined schema in AvroSchemaRegistry in your case ) in a data file (or) etc...&lt;/P&gt;&lt;P&gt;Add &lt;STRONG&gt;schema.name &lt;/STRONG&gt;attribute to the flowfile that matches the avro schema registry name and convert record processor writes the schema that you have mentioned in the avro schema registry(i.e long type) in the output flowfile from ConvertRecord procesor.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;&lt;U&gt;AvroSetWriter controller service configs:&lt;/U&gt;&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="80430-avrosetwriter.png" style="width: 1619px;"&gt;&lt;img src="https://community.cloudera.com/t5/image/serverpage/image-id/18631i455A9AE72E0451E9/image-size/medium?v=v2&amp;amp;px=400" role="button" title="80430-avrosetwriter.png" alt="80430-avrosetwriter.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;With these configs you are going to have new avro data file with AvroSchemaRegistry schema embed in it.&lt;/P&gt;</description>
      <pubDate>Sun, 18 Aug 2019 07:51:41 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Avro-Schema-change-using-COnvertRecord-processor-in-Nifi/m-p/188604#M80402</guid>
      <dc:creator>Shu_ashu</dc:creator>
      <dc:date>2019-08-18T07:51:41Z</dc:date>
    </item>
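<!--
The reply above says to register a schema in AvroSchemaRegistry whose fields use long rather than bytes, matched via a schema.name attribute. A hedged sketch of what such a registered schema might look like; the record and field names are assumptions, not taken from the thread.

```python
import json

# Illustrative Avro schema: the numeric column declared as long instead
# of the bytes type that ExecuteSQL produced. All names are hypothetical.
schema = {
    "type": "record",
    "name": "oracle_row",
    "fields": [
        {"name": "ID", "type": ["null", "long"]},
        {"name": "NAME", "type": ["null", "string"]},
    ],
}
print(json.dumps(schema, indent=2))
```

In NiFi, JSON like this would be the value of a dynamic property in the AvroSchemaRegistry controller service, and the flowfile's schema.name attribute would be set to that property's name.
-->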
    <item>
      <title>Re: Avro schema change using ConvertRecord processor in NiFi</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Avro-Schema-change-using-COnvertRecord-processor-in-Nifi/m-p/188605#M80403</link>
      <description>&lt;P&gt;Thanks Shu.&lt;/P&gt;</description>
      <pubDate>Wed, 11 Jul 2018 17:53:01 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Avro-Schema-change-using-COnvertRecord-processor-in-Nifi/m-p/188605#M80403</guid>
      <dc:creator>ishu12</dc:creator>
      <dc:date>2018-07-11T17:53:01Z</dc:date>
    </item>
  </channel>
</rss>

