1973 Posts · 1225 Kudos Received · 124 Solutions

My Accepted Solutions
| Views | Posted |
|---|---|
| 1978 | 04-03-2024 06:39 AM |
| 3121 | 01-12-2024 08:19 AM |
| 1699 | 12-07-2023 01:49 PM |
| 2476 | 08-02-2023 07:30 AM |
| 3464 | 03-29-2023 01:22 PM |
06-02-2021 06:22 AM · 1 Kudo
@hegdemahendra The NiFi CLI Toolkit [1] can help here to an extent. It provides the following NiFi Registry capabilities:

registry current-user
registry list-buckets
registry create-bucket
registry delete-bucket
registry list-flows
registry create-flow
registry delete-flow
registry list-flow-versions
registry export-flow-version
registry import-flow-version
registry sync-flow-versions
registry transfer-flow-version
registry diff-flow-versions
registry upload-bundle
registry upload-bundles
registry list-bundle-groups
registry list-bundle-artifacts
registry list-bundle-versions
registry download-bundle
registry get-bundle-checksum
registry list-extension-tags
registry list-extensions
registry list-users
registry create-user
registry update-user
registry list-user-groups
registry create-user-group
registry update-user-group
registry get-policy
registry update-policy
registry update-bucket-policy

You can get a description of each by executing:

<path to>/cli.sh registry sync-flow-versions -h

Since you are changing flow persistence providers and not trying to sync flows to a new NiFi Registry, you really can't use the "sync-flow-versions" function above. Even in that scenario, I don't see it accomplishing your goal, because you would end up with new flow ids.

When you create a bucket in NiFi Registry, it is assigned a bucket id (a random UUID). When you start version control on a Process Group (PG) in NiFi, you choose an existing bucket, and a new flow id (another random UUID) is created first. The initial version 1 of that PG's flow is then created and assigned to that flow id in NiFi Registry. Since you cannot force the UUID assigned as the flow id, syncing flows from registry 1 to registry 2 would not track to the version-controlled PGs in your NiFi because of the change in flow ids.

In your scenario, you would need to export all your flows version by version (it is important that you keep track of the version of each flow you extract). So for a flow with id XYZ you may have 6 versions. This means you would use:

registry export-flow-version

I'd suggest naming each produced JSON file using the source flow id and flow version, like XYZ_v1.json, XYZ_v2.json, etc. Example:

./cli.sh registry export-flow-version -ot json -u http://<nifi-registry hostname>:<port>/ -f c97fd570-e2ef-4001-98c9-8810244b6015 -fv 1 -o /tmp/c97fd570-e2ef-4001-98c9-8810244b6015_ver1.json

You should then save off your original DB and delete all existing flows, so that all you have left are your original buckets. Then take all these exported flows and import them back into the registry after switching to your new persistence provider. Keep in mind that before importing each flow's versions, you must first create a new flow within the correct, still-existing bucket.
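The per-version export described above can be scripted with a small shell loop. A minimal sketch, assuming the toolkit lives at /opt/nifi-toolkit/bin/cli.sh and the flow has 6 versions (the path, registry URL, and version count are placeholders; the flow id is the one from the example above):

```shell
# Export every version of one flow from the source registry.
# CLI, REGISTRY_URL, FLOW_ID, and NUM_VERSIONS are placeholders
# for your own environment.
CLI=${CLI:-/opt/nifi-toolkit/bin/cli.sh}
REGISTRY_URL=${REGISTRY_URL:-http://nifi-registry-host:18080}
FLOW_ID=c97fd570-e2ef-4001-98c9-8810244b6015
NUM_VERSIONS=6

for v in $(seq 1 "$NUM_VERSIONS"); do
  # Name each file <flow id>_v<version>.json so the pair survives the move.
  out="/tmp/${FLOW_ID}_v${v}.json"
  "$CLI" registry export-flow-version \
      -u "$REGISTRY_URL" -f "$FLOW_ID" -fv "$v" -ot json -o "$out"
done
```

Run one such loop per flow id; the filenames carry the original id and version you will need during import.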
Keep track of which newly assigned flow id corresponds to which original flow id you are importing into (very important). Then you MUST import each flow's versions in exact version 1 to version x order. If you import version 5 of flow XYZ first, it will become version 1 within that new flow id; the version recorded in the exported JSON is not used when importing, and each import is simply assigned the next incremental version within the new flow id.

Once you are done, you have a bunch of new flow ids with all your versions imported. Now you need to edit your flow.xml.gz in NiFi. For every version-controlled PG in that flow.xml.gz you will find a section that looks like this:

<versionControlInformation>
<registryId>912e8161-0176-1000-ffff-ffff98135aca</registryId>
<bucketId>0cab84ff-399b-4113-9767-687e8e33e48a</bucketId>
<bucketName>bucket-name</bucketName>
<flowId>136b3ba8-bc6f-46dd-afe5-235a80ef8cfe</flowId>
<flowName>flow-name</flowName>
<flowDescription/>
<version>5</version>
</versionControlInformation>

Everything here should remain the same except for the change in "flowId". This allows you to do a global search and replace of "<flowId>original id</flowId>" with "<flowId>new id</flowId>". Make sure you stop all NiFi nodes, put the same modified flow.xml.gz on all nodes (backing up the original), and start the NiFi nodes again. Your PGs should then be tracking the new flows imported into your registry, now backed by the GitFlowPersistenceProvider.

[1] https://nifi.apache.org/docs/nifi-docs/html/toolkit-guide.html#nifi_CLI

Sorry, there is no automated path for this.

If you found this addressed your query, please take a moment to login and click "Accept" on those solutions which assisted you.

Thanks, Matt
06-01-2021 02:29 PM · 1 Kudo
CountText counts lines (split on \r\n). QueryRecord counts the number of records, even when two records share a single line.
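The distinction can be illustrated outside NiFi with made-up sample data: two lines of text, but three JSON records, so a line count and a record count disagree (QueryRecord itself would run SQL such as SELECT COUNT(*) FROM FLOWFILE against a record reader):

```shell
# Two lines of text, but three JSON records: line-based counting
# (what CountText does) and record-based counting give different answers.
printf '{"id":1}{"id":2}\n{"id":3}\n' > /tmp/records.json

wc -l < /tmp/records.json                  # lines   -> 2
grep -o '"id"' /tmp/records.json | wc -l   # records -> 3
```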
05-27-2021 07:55 AM
What version of NiFi are you using? There are some bugs with InvokeHTTP in older versions. Can you access that URL from that machine via curl? There may be a networking or firewall issue.

https://www.datainmotion.dev/2021/01/flank-real-time-transit-information-for.html
https://www.datainmotion.dev/2021/03/using-cloudera-flow-management-powered.html
https://community.cloudera.com/t5/Community-Articles/Real-Time-Stock-Processing-With-Apache-NiFi-and-Apache-Kafka/ta-p/249221
https://community.cloudera.com/t5/Community-Articles/Smart-Stocks-with-FLaNK-NiFi-Kafka-Flink-SQL/ta-p/308223
05-24-2021 12:02 PM
Wrapping your SQL in a view, procedure, function, or other database-native grouping of statements is smartest. For running multiple SQL statements you may want to use Cloudera CDE, Cloudera Machine Learning jobs, YARN Spark jobs, or Airflow.

https://www.datainmotion.dev/2019/10/migrating-apache-flume-flows-to-apache_15.html
https://www.datainmotion.dev/2020/12/simple-change-data-capture-cdc-with-sql.html
05-24-2021 11:58 AM · 1 Kudo
Use JDK 8 or JDK 11; JDK 9 is not supported. JDK 9 issues: https://github.com/graphhopper/graphhopper/issues/1391
05-24-2021 11:56 AM
Some examples:

https://community.cloudera.com/t5/Support-Questions/How-Extract-text-from-a-multiline-flow-and-create-only-one/td-p/104706
https://nathanlabadie.com/recombining-multiline-logs-with/
https://github.com/tspannhw/EverythingApacheNiFi/blob/main/README.md
05-23-2021 06:44 AM
Good suggestion! This app is really helpful for tracking and storing your own location ... The references are also very helpful to me.

http://owntracks.org/booklet/
http://owntracks.org/booklet/tech/json/
http://osmand.net/build_it

N.Miller
05-21-2021 07:15 AM
Use UpdateRecord (CSV reader, JSON writer) to change the date, then PutDatabaseRecord to save to Oracle.

https://www.datainmotion.dev/2020/12/simple-change-data-capture-cdc-with-sql.html
https://www.datainmotion.dev/2019/10/migrating-apache-flume-flows-to-apache_15.html
https://www.datainmotion.dev/2021/01/flank-real-time-transit-information-for.html
https://www.datainmotion.dev/2020/12/smart-stocks-with-flank-nifi-kafka.html
https://github.com/tspannhw/EverythingApacheNiFi
https://www.datainmotion.dev/2021/03/processing-fixed-width-and-complex-files.html
https://www.datainmotion.dev/2020/06/no-more-spaghetti-flows.html
05-12-2021 12:26 PM
Thanks @MattWho. Sure, I will consider your suggestion not to run multiple NiFi instances on the same machine. I tried the variable registry approach as well, but it has the same problem: we cannot use the EL parameter.
05-11-2021 05:58 AM
VMs are not optimal. Run microservices in Kafka Connect, NiFi Stateless, Flink, Spark, or Python in CML or Jupyter Notebooks. SQL Stream Builder is also a good option.