Member since: 02-01-2022
Posts: 269
Kudos Received: 95
Solutions: 59
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1907 | 06-12-2024 06:43 AM
 | 2665 | 04-12-2024 06:05 AM
 | 1977 | 12-07-2023 04:50 AM
 | 1177 | 12-05-2023 06:22 AM
 | 2076 | 11-28-2023 10:54 AM
07-20-2022
05:02 AM
@AbhishekSingh It looks like the issue with the replace Expression Language function was just the backtick (`) characters around the database and table names. The following works for me:

${query:replace('`nrpuserorgdb2306`.`status`', '`nrpreportdb`.`user_org_status`')}

Notice the use of a simple flow to test the concept: I set query in GenerateFlowFile and define a separate query2 in UpdateAttribute. I always work in ways that prove functionality first, then take the lesson learned into real flows. Template and Flow Definition File on my GitHub: https://github.com/cldr-steven-matison/NiFi-Templates
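To see why the backticks matter, here is a minimal Python sketch of the same substitution (the query text is a made-up example). Like the EL replace function, str.replace performs a literal substring match, so the search string must include the backticks exactly as they appear in the query:

```python
# Literal substring replacement, mirroring the EL replace above.
# The query text itself is a hypothetical example.
query = "SELECT * FROM `nrpuserorgdb2306`.`status` WHERE active = 1"

rewritten = query.replace("`nrpuserorgdb2306`.`status`",
                          "`nrpreportdb`.`user_org_status`")
print(rewritten)
# SELECT * FROM `nrpreportdb`.`user_org_status` WHERE active = 1
```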
06-23-2022
09:47 AM
@araujo It would be awesome if you could also link a flow definition file.
06-13-2022
06:38 AM
1 Kudo
I have done something similar when I needed to deliver jar files to all nodes. It is very much a "this is not how things are done" approach, but in this case I did not have access to the nodes' file systems without doing it in a flow. That said, it works great! The first processor creates a flowfile on every node (even when I don't know how many there are); each node then checks for the file and, if it is not found, proceeds to get the file and write it to the file system.
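For reference, here is a minimal Python sketch of the per-node logic that flow implements; the jar URL and target path below are hypothetical stand-ins:

```python
# Check-then-fetch logic, run independently on each node.
from pathlib import Path
import urllib.request

JAR_PATH = Path("/opt/nifi/extra-libs/driver.jar")  # hypothetical target path
JAR_URL = "https://example.com/driver.jar"          # hypothetical source URL

if not JAR_PATH.exists():
    # Only fetch and write the jar when it is missing on this node.
    JAR_PATH.parent.mkdir(parents=True, exist_ok=True)
    urllib.request.urlretrieve(JAR_URL, JAR_PATH)
```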
06-07-2022
05:20 AM
Fun with Python: you are going to need to resolve all the dependencies. I am not familiar with the last error, but it is definitely saying psycopg2 was not found.
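As a quick sanity check, here is a minimal sketch to confirm the dependency is visible to the same Python interpreter that runs your script (psycopg2-binary is the usual prebuilt package):

```python
# If this raises ModuleNotFoundError, install the package first, e.g.:
#   pip install psycopg2-binary
import psycopg2

print(psycopg2.__version__)
```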
06-01-2022
10:17 AM
1 Kudo
Yes, it is not available before 1.16. Definitely a great new feature!
06-01-2022
07:48 AM
1 Kudo
@leandrolinof I believe you are looking for a brand-new NiFi feature found in 1.16 that allows you to control failure and retry: framework-level retry is now supported. For many years users built flows in various ways to make retries happen for a configured number of attempts. Now this is easily and cleanly configured in the UI/API, which simplifies the user experience and flow design considerably. To those who waited years for this: thank you for your patience. Reference: https://cwiki.apache.org/confluence/display/NIFI/Release+Notes#ReleaseNotes-Version1.16.0 You can find more about what's new in NiFi 1.16 in the video below; Mark also shows a bit of the new retry mechanism around 11:50. https://www.youtube.com/watch?v=8G6niPKntTc
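If you prefer to script it, below is a hedged sketch of setting the retry configuration through the NiFi REST API. The host, processor id, and revision handling are assumptions, authentication is omitted, and the retry field names reflect my understanding of the 1.16 processor config schema; verify against your instance's /nifi-api docs before relying on it:

```python
# Sketch only: update a processor's framework-level retry settings.
import requests

NIFI = "https://localhost:8443/nifi-api"          # assumed host
PROC_ID = "0184a1b2-0000-1000-0000-000000000000"  # hypothetical processor id

proc = requests.get(f"{NIFI}/processors/{PROC_ID}").json()
body = {
    "revision": proc["revision"],  # NiFi requires the current revision on updates
    "component": {
        "id": PROC_ID,
        "config": {
            "retryCount": 3,                      # attempts before giving up
            "retriedRelationships": ["failure"],  # which relationships to retry
            "backoffMechanism": "PENALIZE_FLOWFILE",
            "maxBackoffPeriod": "10 mins",
        },
    },
}
requests.put(f"{NIFI}/processors/{PROC_ID}", json=body).raise_for_status()
```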
05-31-2022
05:48 AM
@dfdf As the error suggests, you need to install the MySQL connector. I believe this link will get you there: https://docs.cloudera.com/csa/1.3.0/installation/topics/csa-ssb-configuring-mysql.html#ariaid-title3
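In practice the step amounts to fetching the MySQL JDBC connector jar and placing it where SSB can load it. A hedged sketch, where the connector version and target directory are assumptions (use the paths from the Cloudera doc above for your release):

```python
# Download the MySQL JDBC connector jar to an assumed SSB/Flink lib directory.
import urllib.request

URL = ("https://repo1.maven.org/maven2/mysql/mysql-connector-java/"
       "8.0.28/mysql-connector-java-8.0.28.jar")
DEST = "/opt/cloudera/parcels/FLINK/lib/flink/lib/mysql-connector-java.jar"  # assumed path

urllib.request.urlretrieve(URL, DEST)
```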
05-24-2022
06:02 AM
1 Kudo
@FediMannoubi Below is a basic approach to solve this. Assuming both Postgres tables are populated with rows per your example, your NiFi flow would first need to get the CSV (there are various ways to do that). Once the contents of the CSV are in a flowfile (I use a GenerateFlowFile processor for testing), you can use a RecordReader-based processor to read the CSV. This allows you to write SQL against the flowfile with QueryRecord to get a single value, for example: SELECT city_name FROM FLOWFILE

Next, you need to get the city_name value into an attribute; I use EvaluateJsonPath. After that comes an ExecuteSQL processor with an associated DBCPConnectionPool to Postgres, where your query is: SELECT city_id FROM CITY WHERE city_name = '${city_name}'

At the end of this flow you will have the city_name from the CSV and the city_id from Postgres, which you can combine or use further downstream to suit your needs. INSERT is done similarly: once you have the data in flowfiles or attributes, you write an insert instead of a select in the same ExecuteSQL. My test flow looks like this, but forgive the end, as I did not actually have a Postgres database set up. You can find this sample flow [here]. I hope this gets you pointed in the right direction for reading CSV and querying data from a database.
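To make the moving parts concrete, here is a hedged Python sketch of what the flow does end to end, using psycopg2. The CSV content, table, and connection details are stand-ins matching the example above:

```python
# Read city_name from CSV content, then look up city_id in Postgres.
import csv
import io
import psycopg2

csv_text = "city_name\nParis\n"  # stand-in for the flowfile content
city_name = next(csv.DictReader(io.StringIO(csv_text)))["city_name"]

# Assumed connection details; use your DBCPConnectionPool settings here.
conn = psycopg2.connect(host="localhost", dbname="mydb",
                        user="me", password="secret")
with conn, conn.cursor() as cur:
    # Same lookup ExecuteSQL performs once ${city_name} is resolved,
    # parameterized instead of string-substituted.
    cur.execute("SELECT city_id FROM CITY WHERE city_name = %s", (city_name,))
    city_id = cur.fetchone()[0]

print(city_name, city_id)
```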
05-17-2022
12:35 PM
Nice one sir!
05-17-2022
12:34 PM
@joshtheflame CDP Private Cloud, for on-prem, can be deployed on OpenShift Kubernetes. CDP Public Cloud, in AWS, Azure, or GCP, is fully Kubernetes-deployed on the respective cloud's Kubernetes platform. CDP is hybrid- and multi-cloud capable as well. Check out CDP Private Cloud: https://docs.cloudera.com/data-warehouse/1.3.1/openshift-environments/topics/dw-private-cloud-openshift-environments-overview.html and CDP Public Cloud: https://docs.cloudera.com/cdp/latest/overview/topics/cdp-overview.html