Member since: 11-14-2019
51 Posts
6 Kudos Received
3 Solutions
My Accepted Solutions
Title | Views | Posted |
---|---|---|
| 1205 | 05-26-2023 08:40 AM |
| 1366 | 05-22-2023 02:38 AM |
| 1229 | 05-15-2023 11:25 AM |
05-22-2023
06:40 AM
Congratulations on solving your issue @SandyClouds and thank you for posting the solution in case it can assist others.
05-22-2023
03:20 AM
When you write a website with some links to other pages and maybe a form or two, you've already done real REST. The client (i.e. a web browser) presents the current resource to a user, automatically discovers related resources and allows the user to create/edit their own resources. Why people don't apply the same principles to the APIs they write is baffling.
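The browser analogy above can be sketched in a few lines. This is a hypothetical hypermedia-style ("HATEOAS") response, not any particular framework's format; the `_links` field name and the `discover` helper are assumptions for illustration:

```python
# A resource that carries links to related resources, so a client can
# discover them at runtime instead of hard-coding every URL -- the same
# way a browser follows <a href> links on a web page.
order = {
    "id": 42,
    "status": "shipped",
    "_links": {
        "self":     {"href": "/orders/42"},
        "customer": {"href": "/customers/7"},
        "cancel":   {"href": "/orders/42/cancel", "method": "POST"},
    },
}

def discover(resource, rel):
    """Return the href for a named link relation, if the server offers one."""
    link = resource.get("_links", {}).get(rel)
    return link["href"] if link else None

print(discover(order, "customer"))  # /customers/7
```

A client written against `discover` keeps working when the server moves `/customers/7`, because the link, not the URL, is the contract.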
05-15-2023
12:09 PM
1 Kudo
Congratulations on solving your issue @SandyClouds and thanks for taking the time to share it with the community.
05-10-2023
11:37 PM
1 Kudo
@SandyClouds, I do not have a template because I no longer have access to that project, but I have provided you with all the info you need to develop your own system 🙂 It mostly depends on your use case; you can use any API your use case requires. In terms of processors, you require the following: InvokeHTTP to perform the API call, EvaluateJSONPath + SplitJSON to extract the relevant lines out of your bulletin boards, and RouteOnAttribute to identify the errors you need. Other than that, your imagination is your best friend.
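The split-and-route step of that chain can be sketched in plain Python. The payload shape below is an assumption loosely modeled on a NiFi bulletin-board response; the field names are illustrative, not a guaranteed API contract:

```python
import json

# Hypothetical bulletin payload (field names assumed for illustration).
payload = json.loads("""
{
  "bulletinBoard": {
    "bulletins": [
      {"bulletin": {"level": "ERROR",   "sourceName": "PutSQL",     "message": "Connection refused"}},
      {"bulletin": {"level": "WARNING", "sourceName": "GetFile",    "message": "Directory empty"}},
      {"bulletin": {"level": "ERROR",   "sourceName": "InvokeHTTP", "message": "Timeout"}}
    ]
  }
}
""")

def route_errors(board):
    """Split the bulletin array and keep ERROR-level entries,
    mimicking what SplitJSON + RouteOnAttribute do in the flow."""
    bulletins = board["bulletinBoard"]["bulletins"]
    return [b["bulletin"] for b in bulletins if b["bulletin"]["level"] == "ERROR"]

errors = route_errors(payload)
print([e["sourceName"] for e in errors])  # ['PutSQL', 'InvokeHTTP']
```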
04-26-2023
09:05 AM
What is your run schedule time for the processor? If it is too low (say 1 sec or 30 sec), try increasing it to a reasonable interval, such as a minute, based on your refresh requirements. @Gagan1707
04-26-2023
07:40 AM
I am looking for suggestions and knowledge on streaming job failures and restarts. I created a NiFi flow that captures changes in a source DB and pushes them to a destination DB (a DWH). The changes are captured using the CaptureChangeMySQL processor, which runs 24/7, so I consider it similar to a streaming service.

During development, whenever the job failed for whatever reason, I would stop the job, clear its state, fix the issue, and rerun the job. But once it goes live, whenever the job fails, all the changes happening at the source between the time I fix and rerun the job are missed or lost. Imagine a complex issue that takes a day to fix: 24 hours of data would be lost at the destination.

I would like to know about the different options for handling these situations and eliminating data loss. I tried the "Retrieve All Records" option in the CaptureChangeMySQL 1.18.0 processor, but it fills up entire queues and takes forever to process the latest records. Can someone please throw some light here?
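The general pattern behind resuming without loss is checkpointing: persist the position of the last applied change and, on restart, continue from it rather than clearing state. The sketch below is a toy illustration of that idea (all names hypothetical), not the actual CaptureChangeMySQL implementation:

```python
# Stand-in for the MySQL binlog: (position, change) pairs.
source_log = [
    (1, "INSERT a"), (2, "INSERT b"), (3, "UPDATE a"), (4, "DELETE b"),
]

checkpoint = {"pos": 0}   # stand-in for durable processor state
applied = []              # stand-in for the destination DB

def run_once(fail_after=None):
    """Process changes past the checkpoint; optionally crash mid-run."""
    for pos, change in source_log:
        if pos <= checkpoint["pos"]:
            continue                          # already applied before the crash
        if fail_after is not None and pos > fail_after:
            raise RuntimeError("simulated failure")
        applied.append(change)
        checkpoint["pos"] = pos               # commit position AFTER applying

try:
    run_once(fail_after=2)                    # crash after position 2
except RuntimeError:
    pass
run_once()                                    # restart: resumes at position 3
print(applied)  # ['INSERT a', 'INSERT b', 'UPDATE a', 'DELETE b']
```

Because the position is committed only after each change is applied, nothing between the failure and the rerun is skipped; it is replayed from the checkpoint instead.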
Labels:
- Apache NiFi
03-22-2023
03:54 AM
@mburgess @SamCloud ConvertJSONToSQL is giving me SQL, but with '?' marks. Example: "DELETE FROM tab1 WHERE id = ? AND feature_name = ? AND state = ? AND tenant_id = ?". When I print via PutFile, the JSON has valid values: [{"id":1,"feature_name":"fet1","state":16,"tenant_id":"abc123"}]. I want to understand where I am going wrong. Please help.
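For context on those '?' marks: parameterized SQL like this is normal output, with placeholders bound to the actual values at execution time rather than interpolated into the statement text. The sketch below shows the same DELETE with the post's JSON values bound to the placeholders, using sqlite3 purely for illustration:

```python
import sqlite3

# Set up a table matching the statement from the post.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE tab1 (id INTEGER, feature_name TEXT, state INTEGER, tenant_id TEXT)"
)
conn.execute("INSERT INTO tab1 VALUES (?, ?, ?, ?)", (1, "fet1", 16, "abc123"))

# The '?' placeholders are filled from the JSON record at execution time;
# the SQL text itself never contains the literal values.
row = {"id": 1, "feature_name": "fet1", "state": 16, "tenant_id": "abc123"}
conn.execute(
    "DELETE FROM tab1 WHERE id = ? AND feature_name = ? AND state = ? AND tenant_id = ?",
    (row["id"], row["feature_name"], row["state"], row["tenant_id"]),
)

remaining = conn.execute("SELECT COUNT(*) FROM tab1").fetchone()[0]
print(remaining)  # 0 -- the row was matched and deleted
```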
03-21-2023
02:46 AM
Hi Team, I am desperately looking for a helping hand here. My flow replicates data from several small databases into a single big database. I am using the CaptureChangeMySQL processor to capture changes in the small databases and PutDatabaseRecord to write them into the big destination database. The table structure is the same in all databases, but to differentiate rows in the big database there is an additional column.

Example (taking a single simple table in all databases): in the sources, city_name is the PK; in the destination, city_name and db_name together form the PK.

DB1.city table:
city_name
New york
Mumbai

DB2.city table:
city_name
Paris
New york

Destination DB city table:
city_name | db_name
New york | DB1
Mumbai | DB1
Paris | DB2
New york | DB2

I am able to capture the inserts and updates and put them in the destination database. But when I delete a record in a source, nothing happens in the destination database, and there is no error either. How can I make it so that when I delete a record in any source DB, that particular record is deleted in the destination DB? How do I make NiFi identify the record properly in order to delete it? I also wonder how NiFi is able to identify the record that needs to be updated but not the record that needs to be deleted. Please help.
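The crux of the delete case is that the destination key is composite: the delete must match both city_name and db_name, or the two "New york" rows would collide. A minimal sketch of that delete, using sqlite3 and the schema assumed from the post (the flow would have to attach the db_name, e.g. via an attribute or record update, before issuing it):

```python
import sqlite3

# Destination table with the composite PK from the post.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE city (city_name TEXT, db_name TEXT, PRIMARY KEY (city_name, db_name))"
)
rows = [("New york", "DB1"), ("Mumbai", "DB1"), ("Paris", "DB2"), ("New york", "DB2")]
conn.executemany("INSERT INTO city VALUES (?, ?)", rows)

# A delete captured from source DB1 only carries city_name; the flow must
# supply the db_name so only DB1's copy of "New york" is removed.
conn.execute("DELETE FROM city WHERE city_name = ? AND db_name = ?", ("New york", "DB1"))

remaining = sorted(conn.execute("SELECT city_name, db_name FROM city").fetchall())
print(remaining)  # [('Mumbai', 'DB1'), ('New york', 'DB2'), ('Paris', 'DB2')]
```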
Labels:
- Apache NiFi
03-16-2023
10:13 PM
Yes, this is the best approach: insert everything into a staging table and then use MERGE to insert/update the target table. Just truncate the staging table before the next round of inserts.
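The staging-then-merge round trip can be sketched as below. This uses sqlite3 for illustration, and since SQLite has no MERGE statement, INSERT ... ON CONFLICT DO UPDATE (its upsert form) stands in for the merge step; table and column names are made up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target  (id INTEGER PRIMARY KEY, val TEXT)")
conn.execute("CREATE TABLE staging (id INTEGER, val TEXT)")
conn.execute("INSERT INTO target VALUES (1, 'old')")

# 1. Land the batch in staging.
conn.executemany("INSERT INTO staging VALUES (?, ?)", [(1, "new"), (2, "fresh")])

# 2. Merge staging into target: update existing ids, insert new ones.
#    (The "WHERE TRUE" disambiguates SQLite's upsert-after-SELECT parse.)
conn.execute("""
    INSERT INTO target (id, val)
    SELECT id, val FROM staging WHERE TRUE
    ON CONFLICT(id) DO UPDATE SET val = excluded.val
""")

# 3. Truncate staging for the next round.
conn.execute("DELETE FROM staging")

rows = sorted(conn.execute("SELECT id, val FROM target").fetchall())
print(rows)  # [(1, 'new'), (2, 'fresh')]
```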