Member since: 07-29-2020
Posts: 201
Kudos Received: 45
Solutions: 49
My Accepted Solutions
| Title | Views | Posted |
| --- | --- | --- |
|  | 137 | 01-31-2023 10:16 AM |
|  | 88 | 01-30-2023 02:34 PM |
|  | 163 | 01-25-2023 12:42 PM |
|  | 94 | 01-18-2023 12:26 PM |
|  | 282 | 01-17-2023 10:41 AM |
05-18-2022
04:30 PM
Thanks Andre, I did manage to get the token using the access/token API. However, when I provided the token in Postman as Bearer authentication, I still get the 403 Forbidden response. Here is my request and response info as captured by Fiddler; let me know if you see anything wrong:

```
PUT https://[server name]:9443/nifi-api/processors/385fcdc0-0180-1000-0000-000030a768e3/run-status HTTP/1.1
Content-Type: application/json
Accept: application/json, text/javascript, */*; q=0.01
Keep-Alive: timeout=100, max=50000
Authorization: Bearer [access token]
User-Agent: PostmanRuntime/7.29.0
Postman-Token: 5900c41a-f704-43f3-a2e4-a425eeb22569
Host: [host name]:9443
Accept-Encoding: gzip, deflate, br
Connection: keep-alive
Content-Length: 215

{
  "revision": {
    "clientId": "8F3BD748-DBCC-4703-8743-1D98A24B95C2",
    "version": 1.16,
    "lastModifier": "user.name"
  },
  "state": "RUNNING",
  "disconnectedNodeAcknowledged": true
}
```

Response:

```
HTTP/1.1 403 Forbidden
X-Frame-Options: SAMEORIGIN
Content-Security-Policy: frame-ancestors 'self'
X-XSS-Protection: 1; mode=block
X-Content-Type-Options: nosniff
Strict-Transport-Security: max-age=31540000
Content-Length: 0
Server: Jetty(9.4.45.v20220203)
```
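For reference, a minimal Python sketch of the same call (the host, token, and client id below are placeholders, not values from the thread). Two things worth double-checking against the request above: `revision.version` should be the component's current integer revision as returned by `GET /processors/{id}`, not the NiFi release number, and a 403 with a valid token usually means the token's user lacks the "modify the component" policy on that processor even when GET requests succeed.

```python
import json
import urllib.request

BASE = "https://nifi.example.com:9443/nifi-api"  # placeholder host
TOKEN = "<access token from POST /access/token>"  # placeholder
PROCESSOR_ID = "385fcdc0-0180-1000-0000-000030a768e3"

def build_run_status_body(client_id, revision_version, state="RUNNING"):
    """Build the run-status payload. Note: revision.version is the
    component's current revision number (an integer obtained from
    GET /processors/{id}), not the NiFi release version."""
    return {
        "revision": {"clientId": client_id, "version": revision_version},
        "state": state,
        "disconnectedNodeAcknowledged": True,
    }

def set_run_status(state="RUNNING"):
    body = build_run_status_body("my-client", 1, state)
    req = urllib.request.Request(
        f"{BASE}/processors/{PROCESSOR_ID}/run-status",
        data=json.dumps(body).encode(),
        method="PUT",
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {TOKEN}",
        },
    )
    # A 403 here despite a valid token usually indicates the user behind
    # the token has no write ("modify the component") policy on this
    # processor, even though read (GET) requests succeed.
    with urllib.request.urlopen(req) as resp:
        return resp.status
```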
05-18-2022
09:04 AM
Hi, I have a secured NiFi cluster. I'm trying to call a NiFi API to start/stop a processor using Postman. I followed the instructions for the API "PUT /processors/{id}/run-status", providing the Bearer token and the JSON body. However, I keep getting a 403 Forbidden message. Does anybody know why? I'm able to run other APIs successfully, such as getting processor info with "GET /processors/{id}"! I'm guessing it's because I'm using an SSL-secured NiFi with a JKS keystore and truststore, but I'm not sure how to provide this information to Postman. Can anyone help, please?
Labels:
- Apache NiFi
05-12-2022
01:31 PM
1 Kudo
I never worked with Hive DB, but if it only takes Avro format and you have CSV, you can use the ConvertRecord processor to convert a record from one format to another. In this processor you create a CSVReader service, and the writer is an Avro writer.
05-10-2022
01:15 PM
1 Kudo
There are a lot of articles and videos about learning regular expressions. The one I provided uses what is called a positive lookbehind, (?<=), which means: find the ID, which contains alphanumeric characters and hyphens (-), that is preceded (looking behind) by the text "requestUid".
05-10-2022
12:31 PM
1 Kudo
Not sure if this is the most efficient way, but you can use ExtractText, creating a new attribute with the following regular expression: (?<=\"requestUid\":\s")[A-Za-z0-9-]+ . Not sure if you can accomplish that using the EvaluateJsonPath processor, but given that the path can be different, it's probably hard to accomplish.
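To illustrate, here is that lookbehind pattern checked in plain Python (the log line and the id value are made up for the example; the character class is written as [A-Za-z0-9-] so the ranges are unambiguous):

```python
import re

# Hypothetical input resembling the JSON being matched
line = '{"level":"INFO","requestUid":"a1B2-c3D4-e5F6","msg":"done"}'

# Positive lookbehind: match the id only when it directly follows "requestUid":"
pattern = r'(?<="requestUid":")[A-Za-z0-9-]+'

match = re.search(pattern, line)
print(match.group(0))  # a1B2-c3D4-e5F6
```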
05-10-2022
12:28 PM
There are a lot of processors that can help you with that, for example: PutSQL, PutDatabaseRecord, ExecuteSQL.
05-06-2022
10:20 AM
Have you tried SELECT DISTINCT * ... ORDER BY Date, ERROR_MSG?
05-06-2022
06:11 AM
It should not matter if the format of the text inside is like csv.
05-05-2022
03:06 PM
You can do that by using an ExtractText processor to extract the line you need containing the word "bind" into an attribute, using the regular expression "( bind.+$)". Then use a ReplaceText processor to locate the same line with the same regular expression, where the replacement value is the attribute from the ExtractText processor (assume it's called XYZ), like this: ${XYZ:replace(" ","")}
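The same extract-then-replace idea can be sketched in plain Python (the sample text is invented; in NiFi the final step happens via the ${XYZ:replace(" ","")} expression in ReplaceText):

```python
import re

text = "header line\n bind: host = 10.0.0.1 port = 389\nfooter line"

# ExtractText step: capture the line containing "bind" into an attribute
xyz = re.search(r"^.*bind.+$", text, flags=re.MULTILINE).group(0)

# ReplaceText step: put the same line back with its spaces stripped
result = text.replace(xyz, xyz.replace(" ", ""))
print(result)
```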
05-05-2022
12:57 PM
1 Kudo
Not sure if this is the best way, but try using QueryRecord and create one dynamic property to funnel the data into, where your query will look like this: SELECT DISTINCT * FROM FLOWFILE
05-05-2022
12:48 PM
1 Kudo
If this is a CSV file where the first line is the header, you can easily split the source into two flowfiles, one containing all keyword1 rows and another containing all keyword2 rows, using the QueryRecord processor. After you set your record reader/writer to CSV, you can create two dynamic properties representing each keyword, and set the queries as follows:
Keyword1: SELECT * FROM FLOWFILE WHERE KeyWord LIKE 'KeyWord1'
Keyword2: SELECT * FROM FLOWFILE WHERE KeyWord LIKE 'KeyWord2'
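The effect of those two QueryRecord properties can be previewed with an in-memory SQL table (sqlite3 stands in here for QueryRecord's SQL engine, and the rows are made up):

```python
import sqlite3

rows = [("r1", "KeyWord1"), ("r2", "KeyWord2"), ("r3", "KeyWord1")]

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE FLOWFILE (Data TEXT, KeyWord TEXT)")
con.executemany("INSERT INTO FLOWFILE VALUES (?, ?)", rows)

# One dynamic property per keyword; each query yields its own flowfile
kw1 = con.execute("SELECT * FROM FLOWFILE WHERE KeyWord LIKE 'KeyWord1'").fetchall()
kw2 = con.execute("SELECT * FROM FLOWFILE WHERE KeyWord LIKE 'KeyWord2'").fetchall()
print(kw1)  # [('r1', 'KeyWord1'), ('r3', 'KeyWord1')]
print(kw2)  # [('r2', 'KeyWord2')]
```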
04-08-2022
05:40 AM
Hi, I have downloaded version 1.16, which is the latest. I'm trying to secure NiFi with TLS and LDAP. However, I keep getting the following message:

Unknown user with identity 'CN=nifi_admin, OU=NIFI'. Contact the system administrator

My nifi.properties has the following set:

```
nifi.security.user.authorizer=managed-authorizer
nifi.security.user.login.identity.provider=ldap-provider
```

If I set it as follows, it works and accepts the cert & authentication:

```
nifi.security.user.authorizer=single-user-authorizer
nifi.security.user.login.identity.provider=single-user-provider
```

My authorizers file has the identity set as follows:

```
<userGroupProvider>
  ...
  <property name="Initial User Identity 1">CN=nifi_admin, OU=NIFI</property>
</userGroupProvider>
<accessPolicyProvider>
  ...
  <property name="Initial Admin Identity">CN=nifi_admin, OU=NIFI</property>
  <property name="Legacy Authorized Users File"></property>
  <property name="Node Identity 1"></property>
  ...
</accessPolicyProvider>
```

I'm trying to log in first with the cert identity nifi_admin so I can start adding LDAP users. If I log in as the single user, I don't see the Users & Policies menu items. Can someone help point me in the right direction?
Tags:
- authorization
- NiFi
Labels:
- Apache NiFi
02-11-2022
11:10 AM
Thank you so much, that did the trick. You seem to be a JSON Jolt guru. I'm trying to understand it myself, but it's not that straightforward. Can you just elaborate on this syntax for me, which removes null values at the base level:

```
"*": {
  "*": {
    "@1": "&2"
  }
}
```

Why are there two asterisks in front of it?
02-10-2022
10:16 AM
Hi,
I have the following JSON. I would like to create a Jolt spec to remove all nulls from all levels. How can I do that? Any help would be appreciated. Thanks
```json
{
  "ID": "c22b657e-227b-4c65-e6f8-08d88a3e1118",
  "Name": "SomeName",
  "TaggedItemName": "SomeValue",
  "ITRName": "SomeValue",
  "TestReference": null,
  "JobCardName": null,
  "Comments": "test",
  "PrimaryHandoverName": null,
  "SecondaryHandoverNumber": null,
  "CertificationGroupingName": null,
  "DocumentCode": "SomeValue",
  "AssignedToName": "SomeName",
  "DocumentReference": null,
  "TagITRCompletionStatusName": "Rejected",
  "ScheduleRevision": null,
  "Completed": { "SignOff_AuthP": "John", "SignOff_Date": null },
  "Accepted": { "SignOff_AuthP": "Smith", "SignOff_Date": null },
  "Approved": { "SignOff_AuthP": "Ali", "SignOff_Date": null },
  "ScheduleGroup": null,
  "DownloadUri": "SomeUrl",
  "UpdatedDate": "2022-01-18T11:36:33.9057626Z",
  "CreatedDate": "2020-11-30T10:57:20.8534287Z"
}
```
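For comparison, here is what "remove all nulls from all levels" looks like as a plain recursive function. This is not Jolt, just a reference for the result the spec should produce (a trimmed copy of the document is used as input):

```python
def remove_nulls(node):
    """Recursively drop null-valued entries from dicts and lists."""
    if isinstance(node, dict):
        return {k: remove_nulls(v) for k, v in node.items() if v is not None}
    if isinstance(node, list):
        return [remove_nulls(v) for v in node if v is not None]
    return node

doc = {
    "ID": "c22b657e-227b-4c65-e6f8-08d88a3e1118",
    "TestReference": None,
    "Comments": "test",
    "Completed": {"SignOff_AuthP": "John", "SignOff_Date": None},
}
cleaned = remove_nulls(doc)
print(cleaned)
```

A function like this could also run inside an ExecuteScript processor as an alternative to a Jolt spec.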
Labels:
- Apache NiFi
11-04-2021
05:59 AM
Hi, the best way I can think of is to store that attribute name in another attribute. You can also use the ExecuteScript processor and read the attribute you are looking for, as in: var myAttr = flowFile.getAttribute('filename'), and if the value is not null you can route the flowfile to the desired relationship. For more information on how to use the ExecuteScript processor: https://community.cloudera.com/t5/Community-Articles/ExecuteScript-Cookbook-part-1/ta-p/248922
10-14-2021
02:32 PM
Can you share the Jolt script so I can take a look?
10-13-2021
03:30 AM
Hi, have you tried the JoltTransformJSON processor? It does JSON transformation from one format to another. I'm not a Jolt expert myself, but I found a link that might help you with what you are trying to do: https://www.titanwolf.org/Network/q/1600720e-fcfb-4711-bce6-4cca97006d16/y
10-07-2021
05:50 PM
Matt, thanks for your reply. Basically I have two local NiFis: one is 1.11 and the other is 1.14, and both are set up almost identically in nifi.properties. However, 1.11 works when passing the Azure workspace key and ID, while 1.14 throws the error. The log file doesn't have much info besides the message I posted above, but here is the stack trace from the log file, if that helps in any way:

```
java.lang.RuntimeException: HTTP/1.1 403 Forbidden
    at org.apache.nifi.reporting.azure.loganalytics.AbstractAzureLogAnalyticsReportingTask.postRequest(AbstractAzureLogAnalyticsReportingTask.java:164)
    at org.apache.nifi.reporting.azure.loganalytics.AbstractAzureLogAnalyticsReportingTask.sendToLogAnalytics(AbstractAzureLogAnalyticsReportingTask.java:147)
    at org.apache.nifi.reporting.azure.loganalytics.AzureLogAnalyticsReportingTask.sendMetrics(AzureLogAnalyticsReportingTask.java:137)
    at org.apache.nifi.reporting.azure.loganalytics.AzureLogAnalyticsReportingTask.onTrigger(AzureLogAnalyticsReportingTask.java:107)
    at org.apache.nifi.controller.tasks.ReportingTaskWrapper.run(ReportingTaskWrapper.java:44)
    at org.apache.nifi.engine.FlowEngine$2.run(FlowEngine.java:110)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
```

I'm hoping someone can try the same on their end to verify whether they get the same results.
10-07-2021
10:34 AM
Once it brings a file in, it won't bring it in again, because it saves the file's timestamp and then uses that to get only newly added files, and so on.
10-07-2021
09:46 AM
You really don't need a screenshot, because you are not changing many properties:
1. Create a ListFile processor and set the "Input Directory" to whatever directory you want to track.
2. Create a FetchFile processor and connect the ListFile to it via the "success" relationship. Under the processor properties, keep the "File to Fetch" property set to "${absolute.path}/${filename}", since the path and the file name will be set in those attributes by ListFile, and that is it.

After that, the content of the file will be passed via the success relationship, and you can do whatever you want with it, just as if you were using GetFile, except that ListFile keeps state on the latest file timestamp it grabbed and uses that to grab any new files added to the folder, updating the state to the new timestamp, and so on.
10-07-2021
09:32 AM
I think I know what the problem is. Since you are not doing any fragmentation (as in using a split processor), try setting Support Fragmented Transactions to false in PutSQL. Also remove the original/failure connections from the convert processor; you can auto-terminate those relationships in the convert processor's settings tab by checking the box.
10-06-2021
01:44 PM
Hi, you seem to have the correct processors and sequence. Can you share a screenshot with the configuration of the PutSQL? Have you configured PutSQL with the correct DB connection? Also, are you seeing data (flowfiles) flowing into it from the processors above? It could be that you are not getting any data to begin with, or that your configuration in PutSQL is incorrect. By the way, why are you connecting the failure, original, and sql relationships from ConvertJSONToSQL to PutSQL? You will get an error this way; you need to connect just the sql relationship. Another thing: the SQL property in PutSQL needs to be empty so that it processes the SQL received from the convert processor; if you have anything in there, it will not take the SQL insert from the convert processor.
10-06-2021
01:27 PM
Take a look at the NiFi ListFile & FetchFile processors. They work together: ListFile reads file metadata based on the last-read file's modified date and keeps state of that, so that only newly added files are read. FetchFile takes the filename parameter from the ListFile processor and fetches the contents. Hope that helps.
10-04-2021
04:58 PM
Can someone help verify this, please? I'm thinking that there is a bug in version 1.14 around this, because I have tried the same reporting task on version 1.11.4 against the same Azure Log Analytics workspace ID and key and it works, so that tells me it's not something with Azure. Both versions have the same settings when it comes to the nifi.web.https and nifi.remote.input properties, except for nifi.remote.input.socket.port, which is set to different ports. Thank you.
10-01-2021
07:03 PM
Hi, I'm having a strange situation and I would really appreciate your help. Basically, I was able to set up the AzureLogAnalyticsReportingTask (NiFi 1.14) on a cluster, and it was working after providing the workspace ID and key, so I was able to see the logs under Azure Custom Logs and create my queries and dashboard. However, after a couple of days it started generating the following error: AzureLogAnalyticsReportingTask [id=] Failed to publish metrics to Azure Log Analytics: java.lang.RuntimeException: HTTP/1.1 403 Forbidden. Not sure what happened. I re-entered the workspace ID and key, but that did not help. Can you please help? Thanks
Labels:
- Apache NiFi
09-27-2021
08:02 PM
I think the problem is that you are trying to evaluate a path on a list of sensors: "sensor":[{"sensor_id":"Data05_37_alarm","value":"0"},{"sensor_id":"Data06_37_alarm","value":"0"}...]. You probably need to flatten the JSON first, and if you are trying to get each sensor's information, you need to do a SplitJson and then you can do EvaluateJsonPath. Hope that helps.
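What SplitJson followed by EvaluateJsonPath does here can be mimicked in a few lines of Python (the payload is trimmed from the question):

```python
payload = {
    "sensor": [
        {"sensor_id": "Data05_37_alarm", "value": "0"},
        {"sensor_id": "Data06_37_alarm", "value": "0"},
    ]
}

# SplitJson with a JsonPath of $.sensor emits one flowfile per array element...
splits = payload["sensor"]

# ...after which EvaluateJsonPath can read $.sensor_id from each element
sensor_ids = [s["sensor_id"] for s in splits]
print(sensor_ids)  # ['Data05_37_alarm', 'Data06_37_alarm']
```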
Tags:
- NiFi
09-25-2021
03:57 PM
It's solved after upgrading to 1.14.0.
09-25-2021
03:55 PM
It's solved in the latest version, 1.14.0. I'm not sure if they addressed this issue in later releases of the 1.13 line.
07-16-2021
11:33 AM
This appears to be fixed in the latest release, 1.14.0. I will mark this as resolved. Thanks.
06-20-2021
11:43 AM
Hi, I have downloaded the latest version of NiFi, 1.13.2. In one of my flows I'm using an ExecuteScript processor that utilizes the Python engine, which I know was working without problems in an older version (1.11.4). I noticed that when the processor executes for the first time it works; then, if you stop it and run it again, it hangs and the flowfiles get stuck in the upstream queue. Initially I thought something in my script was causing the issue, so I changed it to something as simple as this:

```python
flowFile = session.get()
if flowFile != None:
    # All processing code starts at this indent
    session.transfer(flowFile, REL_SUCCESS)
```

Even with this simple script, the processor hangs and the flowfiles remain in the queue. This is a major issue and it's keeping me from upgrading. I'm not sure if you are aware of it, and if there is anything that can be done to mitigate it. Please advise.
Labels:
- Apache NiFi