Member since: 07-29-2020
Posts: 574
Kudos Received: 323
Solutions: 176
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 1957 | 12-20-2024 05:49 AM |
|  | 2185 | 12-19-2024 08:33 PM |
|  | 2003 | 12-19-2024 06:48 AM |
|  | 1324 | 12-17-2024 12:56 PM |
|  | 1854 | 12-16-2024 04:38 AM |
10-13-2021
03:30 AM
Hi, Have you tried the JoltTransformJSON processor? It transforms JSON from one format into another. I'm not a Jolt expert myself, but I found a link that might help with what you are trying to do: https://www.titanwolf.org/Network/q/1600720e-fcfb-4711-bce6-4cca97006d16/y
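For example, here is a minimal shift spec (untested, with a made-up input shape, just to illustrate the idea of moving a nested field to the top level):

```json
[
  {
    "operation": "shift",
    "spec": {
      "a": {
        "b": "c"
      }
    }
  }
]
```

With an input of {"a": {"b": 1}} this spec produces {"c": 1}; the right-hand side of each leaf in the spec is the output path the matched value is moved to.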
10-07-2021
05:50 PM
Matt, Thanks for your reply. Basically I have two local NiFi instances: one is 1.11 and the other is 1.14. Both are set up almost identically in nifi.properties; however, 1.11 works when passing the Azure workspace Key and ID, while 1.14 throws the error. The log file doesn't have much info besides the message I posted above, but here is the stack trace from the log file in case it helps:

```
java.lang.RuntimeException: HTTP/1.1 403 Forbidden
    at org.apache.nifi.reporting.azure.loganalytics.AbstractAzureLogAnalyticsReportingTask.postRequest(AbstractAzureLogAnalyticsReportingTask.java:164)
    at org.apache.nifi.reporting.azure.loganalytics.AbstractAzureLogAnalyticsReportingTask.sendToLogAnalytics(AbstractAzureLogAnalyticsReportingTask.java:147)
    at org.apache.nifi.reporting.azure.loganalytics.AzureLogAnalyticsReportingTask.sendMetrics(AzureLogAnalyticsReportingTask.java:137)
    at org.apache.nifi.reporting.azure.loganalytics.AzureLogAnalyticsReportingTask.onTrigger(AzureLogAnalyticsReportingTask.java:107)
    at org.apache.nifi.controller.tasks.ReportingTaskWrapper.run(ReportingTaskWrapper.java:44)
    at org.apache.nifi.engine.FlowEngine$2.run(FlowEngine.java:110)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
```

I'm hoping someone can try the same on their end to verify whether they get the same results.
10-07-2021
10:34 AM
Once ListFile picks a file up, it won't pick it up again: it saves the file's timestamp in its state and uses that to detect only newer files as they are added, and so on.
10-07-2021
09:46 AM
You really don't need a screenshot because you are not changing many properties:

1. Create a ListFile processor and set its "Input Directory" to whatever directory you want to track.
2. Create a FetchFile processor and connect the ListFile to it via the "success" relationship. Under the processor properties, keep the "File to Fetch" property set to "${absolute.path}/${filename}", since the path and the file name are set in those attributes by ListFile.

That is it. After that, the content of the file is passed via the "success" relationship and you can do whatever you want with it, just as if you were using GetFile, except that ListFile keeps state of the latest file timestamp it grabbed, uses that to pick up any new files added to the folder, and then updates the state to the new timestamp, and so on. A property sketch follows below.
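For reference, a minimal property sketch of the two processors (the input directory is a made-up placeholder; everything else stays at its defaults):

```
ListFile
  Input Directory : /data/incoming                 # placeholder; point at the folder to track
  (listing state is kept automatically, based on the last modified timestamp seen)

FetchFile   <- connected from ListFile via the "success" relationship
  File to Fetch   : ${absolute.path}/${filename}   # attributes written by ListFile
```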
10-07-2021
09:32 AM
I think I know what the problem is. Since you are not doing any fragmentation (as you would when using a split processor), try setting "Support Fragmented Transactions" to false in PutSQL. Also, remove the original/failure connections from the ConvertJSONToSQL processor; you can auto-terminate those relationships by checking the boxes in the processor's Settings tab. A sketch of the resulting setup is below.
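Putting it all together, here is roughly how I would wire and configure it (the connection pool name is a placeholder for your own controller service):

```
ConvertJSONToSQL
  sql      -> connect to PutSQL
  original -> auto-terminate (check the box in Settings)
  failure  -> auto-terminate (or route somewhere for debugging)

PutSQL
  JDBC Connection Pool            : MyDBCPConnectionPool   # placeholder service name
  SQL Statement                   : (leave empty; statements come from the FlowFiles)
  Support Fragmented Transactions : false
```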
10-06-2021
01:44 PM
Hi, You seem to have the correct processors and sequence. Can you share a screenshot of the PutSQL configuration? Have you configured PutSQL with the correct DB connection? Also, are you seeing data (FlowFiles) flowing into it from the upstream processors? It could be that you are not getting any data to begin with, or that your PutSQL configuration is incorrect.

By the way, why are you connecting the failure, original, and sql relationships from ConvertJSONToSQL to PutSQL? You will get errors that way; you only need to connect the sql relationship.

One more thing: the SQL Statement property in PutSQL needs to be empty so that it executes the SQL received from the convert processor. If you have anything in there, it will not take the SQL INSERT coming from the convert processor.
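For context, ConvertJSONToSQL emits a parameterized statement as the FlowFile content, with the actual values carried in sql.args.N.* attributes. For a hypothetical FlowFile like {"id": 1, "name": "a"} targeting a table named USERS, the sql relationship would carry roughly:

```sql
-- FlowFile content produced by ConvertJSONToSQL; the values arrive as
-- sql.args.1.value / sql.args.2.value attributes, which PutSQL binds to the ?s
INSERT INTO USERS (ID, NAME) VALUES (?, ?)
```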
10-06-2021
01:27 PM
Take a look at the NiFi ListFile and FetchFile processors; they work together. ListFile reads file metadata and keeps state based on the modified date of the last file it read, so that only newly added files are listed. FetchFile then takes the filename attributes set by the ListFile processor and fetches the content. Hope that helps.
10-04-2021
04:58 PM
Can someone help verify this, please? I'm thinking there is a bug in version 1.14 around this, because I have tried the same ReportingTask on version 1.11.4 against the same Azure Log Analytics Workspace ID and Key and it works, so that tells me it's not something with Azure. Both versions have the same settings for the nifi.web.https and nifi.remote.input properties, except for nifi.remote.input.socket.port, which is set to a different port on each. Thank you.
10-01-2021
07:03 PM
Hi, I'm having a strange situation and I would really appreciate your help. I was able to set up the AzureLogAnalyticsReportingTask (NiFi 1.14) on a cluster, and it was working after I provided the Workspace ID and Key: I could see the logs under Azure Custom Logs and create my queries and dashboard. However, after a couple of days it started generating the following error:

AzureLogAnalyticsReportingTask [id=] Failed to publish metrics to Azure Log Analytics: java.lang.RuntimeException: HTTP/1.1 403 Forbidden

Not sure what happened. I re-entered the Workspace ID and Key, but that did not help. Can you please help? Thanks
Labels:
- Apache NiFi
09-27-2021
08:02 PM
I think the problem is that you are trying to evaluate a path against a list of sensors: "sensor":[{"sensor_id":"Data05_37_alarm","value":"0"},{"sensor_id":"Data06_37_alarm","value":"0"},...]. You probably need to flatten the JSON first, or, if you are trying to get each sensor's information, use SplitJson and then EvaluateJsonPath. Hope that helps.
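For example, assuming the array lives under $.sensor in your document, the two processors could be configured roughly like this (the JsonPaths are guesses based on your snippet):

```
SplitJson
  JsonPath Expression : $.sensor        # emits one FlowFile per sensor object

EvaluateJsonPath   <- connected from SplitJson via the "split" relationship
  Destination : flowfile-attribute
  sensor_id   : $.sensor_id             # user-defined property -> attribute
  value       : $.value
```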