<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: How to run certain MySQL queries based on records in a FlowFile? in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/How-to-run-certain-MySQL-queries-based-on-records-in-a/m-p/384208#M245271</link>
    <description>&lt;P&gt;Hi &lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/109186"&gt;@iriszhuhao&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;I think you are trying to address two issues in one processor, and I don't think the PutDatabaseRecord processor can do that:&lt;/P&gt;&lt;P&gt;1- Insert/update the records in Table1&lt;/P&gt;&lt;P&gt;2- Add a change log to Table2&lt;/P&gt;&lt;P&gt;Let's address Problem 1 first, with Table1. PutDatabaseRecord can handle multiple records per FlowFile if you have valid JSON containing all the records as an array. In that case, try setting the &lt;STRONG&gt;Data Record Path&lt;/STRONG&gt; property and, depending on the version you are using, the &lt;STRONG&gt;Statement Type Record Path&lt;/STRONG&gt; property, which determines the type of SQL statement (UPDATE, INSERT, etc.) for each record. This means you have to know beforehand which statement applies to which record, and that information has to be stored alongside each record as part of the JSON FlowFile. If you don't have the statement type attached to each record, or the records are not in a JSON array (i.e. each JSON record is separated by a new line), then you have to split the records using the SplitText processor and do a lookup to find out whether each record already exists in the database, to decide the proper SQL statement. For the lookup you can use something like the LookupRecord processor with a SimpleDatabaseLookupService.&lt;/P&gt;&lt;P&gt;Regarding Problem 2, adding a change log to Table2, you have a few options:&lt;/P&gt;&lt;P&gt;1- After you execute PutDatabaseRecord, reuse the statement-type attribute you set during the LookupRecord step (update if the record exists, insert if not) to build the proper SQL script (or call a SQL stored procedure) that adds the change log, and run that SQL through the &lt;STRONG&gt;PutSQL processor&lt;/STRONG&gt; via its &lt;STRONG&gt;SQL Statement&lt;/STRONG&gt; property. In this case you also have to extract record fields into attributes, so the SQL statement can be given the field_id, old_value &amp;amp; new_value and check which values changed, or pass the whole JSON record to SQL and do the parsing there instead. This can be cumbersome if you have too many fields.&lt;/P&gt;&lt;P&gt;2- Instead of handling this in NiFi, you can define an update/insert trigger on Table1 and put the logic in the trigger to add the proper log entries based on the old vs. new values the trigger captures. Again, this might get cumbersome if you have many fields to check, and it might impact performance when updating data in Table1, but it's cleaner than the first option.&lt;/P&gt;&lt;P&gt;3- Use CDC (Change Data Capture). I know this comes out of the box with SQL Server, but I think for MySQL you need a third-party tool to set it up. I can't give you much detail on how, because I don't know myself, but you can research it. The idea is to let CDC capture what was changed/inserted/deleted, then read that log and store it in whatever format/table you desire. Here you don't have to check which field changed, because CDC does that for you.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Hope that helps.
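&amp;nbsp;To illustrate the trigger idea in option 2, here is a minimal sketch (assuming MySQL; the table and column names are placeholders, so adjust them to your schema): &lt;BR /&gt;&lt;CODE&gt;CREATE TRIGGER table1_after_update&lt;BR /&gt;AFTER UPDATE ON table1&lt;BR /&gt;FOR EACH ROW&lt;BR /&gt;BEGIN&lt;BR /&gt;&amp;nbsp;&amp;nbsp;-- null-safe compare: log only when the value actually changed&lt;BR /&gt;&amp;nbsp;&amp;nbsp;IF NOT (OLD.some_field &lt;=&gt; NEW.some_field) THEN&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;INSERT INTO table2 (record_id, field_name, old_value, new_value, changed_at)&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;VALUES (NEW.id, 'some_field', OLD.some_field, NEW.some_field, NOW());&lt;BR /&gt;&amp;nbsp;&amp;nbsp;END IF;&lt;BR /&gt;END&lt;/CODE&gt;&lt;BR /&gt;You would repeat the IF block per audited column (which is exactly why this gets cumbersome with many fields).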
If it does, please let me know if you have any questions; otherwise, &lt;STRONG&gt;please accept&lt;/STRONG&gt; the solution.&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;</description>
    <pubDate>Wed, 28 Feb 2024 18:35:50 GMT</pubDate>
    <dc:creator>SAMSAL</dc:creator>
    <dc:date>2024-02-28T18:35:50Z</dc:date>
  </channel>
</rss>

