Member since: 07-29-2020
Posts: 574
Kudos Received: 320
Solutions: 175
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 240 | 12-20-2024 05:49 AM
 | 274 | 12-19-2024 08:33 PM
 | 286 | 12-19-2024 06:48 AM
 | 237 | 12-17-2024 12:56 PM
 | 223 | 12-16-2024 04:38 AM
05-24-2022
06:02 AM
1 Kudo
@FediMannoubi Below is a basic approach. Assuming both Postgres tables are populated with rows per your example, your NiFi flow first needs to get the CSV (there are various ways to do that; I used a GenerateFlowFile processor for testing). Once the contents of the CSV are in a flowfile, you can use a RecordReader-based processor to read it. QueryRecord lets you write SQL against the flowfile to pull out a single value, for example:

SELECT city_name FROM FLOWFILE

Next, the flow needs to get the city_name value into an attribute; I used EvaluateJsonPath. After that, add an ExecuteSQL processor with an associated DBCP Connection Pool pointing at Postgres. In ExecuteSQL your query is:

SELECT city_id FROM CITY WHERE city_name = '${city_name}'

At the end of this flow you will have the city_name from the CSV and the city_id from Postgres, which you can combine or use further downstream to suit your needs. An INSERT is done similarly: once you have the data in flowfiles or attributes, the same ExecuteSQL approach works with an INSERT statement instead. My test flow looks like this, but forgive the end, as I did not actually have a Postgres database set up. You can find this sample flow [here]. I hope this gets you pointed in the right direction for reading CSV and querying data from a database.
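As a rough sketch of that last INSERT step: the target table city_info and its columns are hypothetical here, and ${city_id}/${city_name} are assumed to be attributes populated earlier in the flow:

```sql
-- Hypothetical INSERT issued from ExecuteSQL (PutSQL also handles DML).
-- ${city_id} and ${city_name} are NiFi flowfile attributes set earlier
-- in the flow; the table and column names are made up for illustration.
INSERT INTO city_info (city_id, city_name)
VALUES (${city_id}, '${city_name}')
```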
05-23-2022
07:11 AM
Yep! That is absolutely correct. I spent more time trying to resolve this than I care to admit, because of my bias. Thanks for this, @SAMSAL
05-12-2022
09:28 PM
Sure @SAMSAL. In Hive I already have a table, so I don't need any SQL INSERT statement. Say the Hive table has XXX, and into that I have to put these Data and Error fields. Will just converting to Avro work, or do I have to call an insert-statement processor?
05-06-2022
06:11 AM
It should not matter, as long as the format of the text inside is CSV-like.
05-05-2022
03:06 PM
You can do that using an ExtractText processor to extract the line you need (the one containing the word "bind") into an attribute, using the regular expression "(bind.+$)". Then use a ReplaceText processor to locate the same line with the same regular expression, and set the replacement value to the attribute from the ExtractText processor (assume it is called XYZ), like this: ${XYZ:replace(" ","")}
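For example, assuming a hypothetical input line, the net effect of the extract-then-replace would be to strip the spaces from just that line:

```
Before: bind 192.168.1.10 8080
After:  bind192.168.1.108080
```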
02-11-2022
01:51 PM
1 Kudo
Great to hear! I try my best to understand Jolt because sometimes it can be quite useful, but I think it has a very convoluted syntax and it's sometimes really hard to use. Practice helps, though. The first asterisk matches against the field names of an object. The second asterisk depends: if the value of the field is a scalar, it will match against the value; if it's a nested object, it will match against the field names of that nested object. The trick is that when it matches against a value, it does not match nulls 😉 Cheers, André
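As a minimal sketch of the nested-object case (field names made up here), this shift spec uses both wildcards and simply reproduces its input, which makes the matching visible:

```
Input:
{ "user": { "name": "Alice" }, "account": { "id": 42 } }

Spec:
[ { "operation": "shift", "spec": { "*": { "*": "&1.&" } } } ]

Output (identical to the input):
{ "user": { "name": "Alice" }, "account": { "id": 42 } }
```

The first `*` matches "user" and "account", the second `*` matches "name" and "id", and "&1.&" rebuilds the original two-level path.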
11-04-2021
05:59 AM
Hi, the best way I can think of is to store that attribute name in another attribute. You can also use the ExecuteScript processor and read the attribute you are looking for, as in: var myAttr = flowFile.getAttribute('filename') and, if the value is not null, route the flowfile to the desired relationship. For more information on how to use the ExecuteScript processor: https://community.cloudera.com/t5/Community-Articles/ExecuteScript-Cookbook-part-1/ta-p/248922
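A minimal sketch of that check as an ExecuteScript body (ECMAScript engine); 'filename' stands in for whichever attribute you are testing:

```javascript
// ExecuteScript (ECMAScript): route on whether an attribute is set.
// session, REL_SUCCESS and REL_FAILURE are bound by the processor.
var flowFile = session.get();
if (flowFile != null) {
    var myAttr = flowFile.getAttribute('filename'); // example attribute name
    if (myAttr != null) {
        session.transfer(flowFile, REL_SUCCESS);
    } else {
        session.transfer(flowFile, REL_FAILURE);
    }
}
```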
10-14-2021
02:32 PM
Can you share the Jolt script so I can take a look?
10-07-2021
05:50 PM
Matt, thanks for your reply. Basically I have two local NiFi instances: one is 1.11 and the other is 1.14. Both are set up almost identically in nifi.properties; however, 1.11 works when passing the Azure workspace key and ID, while 1.14 throws the error. The log file doesn't have much info besides the message I posted above, but here is the stack trace from the log file in case it helps:

```
java.lang.RuntimeException: HTTP/1.1 403 Forbidden
    at org.apache.nifi.reporting.azure.loganalytics.AbstractAzureLogAnalyticsReportingTask.postRequest(AbstractAzureLogAnalyticsReportingTask.java:164)
    at org.apache.nifi.reporting.azure.loganalytics.AbstractAzureLogAnalyticsReportingTask.sendToLogAnalytics(AbstractAzureLogAnalyticsReportingTask.java:147)
    at org.apache.nifi.reporting.azure.loganalytics.AzureLogAnalyticsReportingTask.sendMetrics(AzureLogAnalyticsReportingTask.java:137)
    at org.apache.nifi.reporting.azure.loganalytics.AzureLogAnalyticsReportingTask.onTrigger(AzureLogAnalyticsReportingTask.java:107)
    at org.apache.nifi.controller.tasks.ReportingTaskWrapper.run(ReportingTaskWrapper.java:44)
    at org.apache.nifi.engine.FlowEngine$2.run(FlowEngine.java:110)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
```

I'm hoping someone can try the same on their end to verify whether they get the same results.