![Screen capture 2022-07-14 164058.png](https://community.cloudera.com/t5/image/serverpage/image-id/34871iF5AD593AFE367474/image-size/large?v=v2&px=999)
Here is my current NiFi workflow:
1. GetFile brings the CSV files into the flow.
2. As you can see, the records are saved into separate tables named 'cattle_auction_data' and 'cattle_auction_invalid_data' after going through an UpdateRecord processor that adds a column with the original filename (see the note just below).
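For reference, the filename column is added in UpdateRecord with a dynamic property: the property name is the record path of the new field (e.g. `/filename`, just an example of how I set it up) and the value is the Expression Language `${filename}`, with Replacement Value Strategy set to 'Literal Value'.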
Here is what I am trying to do: the QueryRecord processor runs two SQL queries, routed to the 'valid' and 'invalid' relationships.
![Screen capture 2022-07-14 164926.png](https://community.cloudera.com/t5/image/serverpage/image-id/34873i29E0C63DC002CF40/image-size/large?v=v2&px=999)
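For context, the two dynamic properties on QueryRecord look roughly like this (a sketch based on the query fragments shown further down; the real WHERE clauses are longer):

```sql
-- routed to one relationship: rows where the grade code is '999'
SELECT * FROM FLOWFILE WHERE gradenm = '999'

-- routed to the other relationship: all remaining rows
SELECT * FROM FLOWFILE WHERE gradenm <> '999'
```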
I would like to log the SQL queries and their time of execution, pass them through an UpdateRecord processor to add a column with the filename, and save them into a table named 'validation-log', so that the final product looks like the table below:
| filename | sql_query | time_of_execution |
|----------|-----------|-------------------|
| 2022.csv | select * from FLOWFILE where gradenm = '999'... | 22-07-14 07:33:26 |
| 2022.csv | select * from FLOWFILE where gradenm <> '999'... | 22-07-14 07:33:26 |
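Put differently, the target table would look something like this (just a sketch; the column types are my guess, and I used an underscore in the name here since a hyphen would need quoting in most databases):

```sql
-- hypothetical schema for the logging table described above
CREATE TABLE validation_log (
    filename          VARCHAR(255),   -- original CSV file name, e.g. '2022.csv'
    sql_query         VARCHAR(1000),  -- the QueryRecord SQL that produced the flowfile
    time_of_execution TIMESTAMP       -- when the query was executed
);
```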
I think the 'executequery.query.executiontime' attribute can handle the time-of-execution part, but I don't know how to log the SQL query itself and eventually save it into a table. A lot of awesome people answered my previous posts, but since I am a newbie to NiFi I had to ask another question here. Thanks for the help in advance!