<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Bulk Load SQL Server Data into MySQL with Apache NiFi - Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Bulk-Load-SQL-Server-data-into-MySQL-apache-Nifi/m-p/190452#M152541</link>
    <description>&lt;P&gt;I have billions of rows in SQL Server tables and I'm using NiFi to load them into MySQL. I tried PutSQL and PutDatabaseRecord; both satisfy the requirement, but they take quite a long time to load the data into MySQL (about 100k records per minute, so 1 billion rows would take roughly 10,000 minutes), since they insert record by record. Is there any bulk-load option in NiFi to load the Avro/CSV FlowFiles into MySQL? Could you please suggest an option?&lt;/P&gt;&lt;P&gt;Here is my flow:&lt;/P&gt;&lt;P&gt;ListDatabaseTables -&amp;gt; GenerateTableFetch (partition size = 50k records) -&amp;gt; ExecuteSQL -&amp;gt; ConvertRecord (Avro to CSV) -&amp;gt; PutSQL&lt;/P&gt;&lt;P&gt;ListDatabaseTables -&amp;gt; GenerateTableFetch (partition size = 50k records) -&amp;gt; ExecuteSQL (Avro) -&amp;gt; PutDatabaseRecord&lt;/P&gt;</description>
    <pubDate>Wed, 21 Mar 2018 14:19:23 GMT</pubDate>
    <dc:creator>ramesh_ganginen</dc:creator>
    <dc:date>2018-03-21T14:19:23Z</dc:date>
  </channel>
</rss>