Member since: 07-29-2020
Posts: 574
Kudos Received: 323
Solutions: 176
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2146 | 12-20-2024 05:49 AM |
| | 2444 | 12-19-2024 08:33 PM |
| | 2192 | 12-19-2024 06:48 AM |
| | 1459 | 12-17-2024 12:56 PM |
| | 2087 | 12-16-2024 04:38 AM |
03-04-2023
09:24 AM
Hi, please try the following Jolt spec:

```json
[
  {
    "operation": "shift",
    "spec": {
      "rows": {
        "*": {
          "f": {
            "0": { "v": "[#4].export_time" },
            "1": { "v": "[#4].account_id" },
            "2": { "v": "[#4].cost" }
          }
        }
      }
    }
  }
]
```

If that helps, please accept the solution. Thanks
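To illustrate what the spec does: it flattens each row's f[0].v, f[1].v, and f[2].v into one output record, with [#4] resolving to the row's index in the rows array. The thread's actual input isn't shown here, so the sample shape below is an assumption; this Python sketch (outside NiFi) emulates the shift:

```python
# Hypothetical emulation of the Jolt shift above.
# The input shape and sample values are assumptions, not the thread's real data.
def shift_rows(payload):
    out = []
    for row in payload["rows"]:
        f = row["f"]
        out.append({
            "export_time": f[0]["v"],  # "0": {"v": "[#4].export_time"}
            "account_id": f[1]["v"],   # "1": {"v": "[#4].account_id"}
            "cost": f[2]["v"],         # "2": {"v": "[#4].cost"}
        })
    return out

sample = {"rows": [{"f": [{"v": "2023-01-01"}, {"v": "acct-1"}, {"v": "9.99"}]}]}
print(shift_rows(sample))
# → [{'export_time': '2023-01-01', 'account_id': 'acct-1', 'cost': '9.99'}]
```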
02-27-2023
06:29 AM
1 Kudo
Hi, I don't think EvaluateJsonPath and RouteOnAttribute will work properly here, since the incoming flowfiles contain an array of JSON records. You would have to split the array with the SplitJson processor and then use EvaluateJsonPath -> RouteOnAttribute to get the expected result. However, there is a better and more efficient option than the Split -> Evaluate -> Route chain: you can use a single QueryRecord processor to filter the array and isolate the needed records. You can see an example of how to use QueryRecord here: https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.19.1/org.apache.nifi.processors.standard.QueryRecord/additionalDetails.html If that helps, please accept the solution.
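As a sketch of the QueryRecord approach (the property name and the `status` field below are made-up placeholders, since the actual record schema isn't shown in this thread): add a dynamic property to QueryRecord whose value is a SQL query over the literal table name FLOWFILE, then route that property's relationship onward:

```sql
-- Hypothetical QueryRecord dynamic property, e.g. named "matched".
-- "status" is a placeholder field name; FLOWFILE is the table name
-- QueryRecord exposes for the records in the incoming flowfile.
SELECT * FROM FLOWFILE WHERE status = 'active'
```

Each dynamic property becomes its own outbound relationship containing only the matching records, which replaces the Split -> Evaluate -> Route chain with one processor.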
02-19-2023
08:16 AM
2 Kudos
Hi, try looking into the QueryRecord or PartitionRecord processors. Those might help. Thanks
02-17-2023
07:08 AM
Hi, thanks for the information. I came across this situation before, and I'm not sure there is a better way, but you can use the following processors after you split the data into individual records: ExtractText -> PutSQL.

For ExtractText:
1. Add a dynamic property to capture the entire JSON content of the incoming flowfile.
Note: Be careful if each record's data can be large (> 1024 characters). In that case you need to increase "Maximum Buffer Size" and "Maximum Capture Group Length" accordingly, otherwise the data will be truncated.

For PutSQL:
Once you configure the JDBC Connection Pool, you can set the SQL Statement property to something like:
insert into myTable (jsonCol) values ('${JsonRecord}')

If that helps, please accept the solution.
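The original post referenced a screenshot for the ExtractText configuration that isn't included here; as a sketch, the dynamic property could look like this, using a common whole-content capture pattern (the (?s) flag lets . match newlines so multi-line JSON is captured in full):

```
JsonRecord = (?s)(^.*$)
```

The property name becomes the attribute referenced later as ${JsonRecord} in the PutSQL statement, which is why the buffer and capture-group size limits mentioned above matter for large records.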
02-16-2023
09:36 AM
Hi, can you provide sample/example data of the JSON and how you expect it to be saved?
02-03-2023
11:34 AM
Hmmm, are you sure the InvokeHTTP processor is getting triggered? How is it being triggered? Did you try running it manually using Run Once? Also, can you show how the API is called using curl (you can hide the full URL)?
02-03-2023
08:48 AM
Hi, I think this can be achieved using the following flow:

(Data Source) -> SplitJson -> JoltTransformJSON -> ConvertJSONToSQL -> PutSQL

The configuration for each processor is as follows:

1. SplitJson (to get each of the Rows elements as its own flowfile):
JsonPath Expression = $.Rows

2. JoltTransformJSON (to convert each row's Values element to the proper format):

Example input:
```json
{ "Values": [ "A4", "Test Data A4" ] }
```
Output:
```json
{ "VALUE": "A4", "DESCRIPTION": "Test Data A4" }
```
Jolt spec:
```json
[
  {
    "operation": "shift",
    "spec": {
      "Values": {
        "0": "VALUE",
        "1": "DESCRIPTION"
      }
    }
  }
]
```
Note: The output JSON keys have to match the column names in the SQL table.

3. ConvertJSONToSQL (to convert the Jolt output JSON into the SQL statement that will feed into PutSQL).
Note: You need to populate the JDBC Connection Pool.

4. PutSQL (to execute the insert statement generated above against the DB table).
Note: You need to create the JDBC Connection Pool. Leave the SQL Statement property empty so the processor uses the SQL statement from the flowfile generated in step 3.

If that helps, please accept the solution. Thanks
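For reference, with the example above, ConvertJSONToSQL configured for INSERT would generate a parameterized statement along these lines, with the actual values carried in sql.args.N.value flowfile attributes for PutSQL to bind (the table name here is illustrative, not from the thread):

```sql
-- Hypothetical output of ConvertJSONToSQL for {"VALUE": "A4", "DESCRIPTION": "Test Data A4"};
-- the A4 values travel as sql.args.1.value / sql.args.2.value attributes, not in the SQL text.
INSERT INTO myTable (VALUE, DESCRIPTION) VALUES (?, ?)
```

This is why the SQL Statement property on PutSQL stays empty in step 4: the statement and its bound parameters arrive on the flowfile itself.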
02-03-2023
07:33 AM
1 Kudo
Hi, a few pointers that might help:
- I found that when "Always Output Response" is set to False you will not get a response flowfile, especially when there are errors, so make sure to set it to True to always get a response.
- My understanding is that you are trying to send a query parameter named "data". If that is correct, then what you specified is not going to work, because dynamic properties and "Attributes to Send" are for API header info, not query parameters. To send the query parameter you need to include it as part of the "Remote URL" expression: #{ADSB_url}/?data=${data}
I suggest you first put the exact URL in the Remote URL value to make sure it's working as expected, then work your way back to using the Parameter Context and attributes. Hope that helps.
02-01-2023
07:42 PM
Hi, can you provide sample data with the expected output? You have to be careful with the wrapper and insertion strategies, as the data is matched based on the record index from both locations. This means the data has to be ordered correctly if you are processing multiple records at a time.
02-01-2023
10:26 AM
Hi, you can do that in two ways:
1. Calling P2 from P1, using Output/Input Ports to connect the process groups.
2. Using the NiFi REST API via an InvokeHTTP processor in P1 to control the run state of the initial processor in P2.
Hope that helps. Thanks
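As a sketch of option 2 (the host, port, and processor id below are placeholders, and the revision version must match the processor's current revision in your instance): the REST call InvokeHTTP would issue to start the initial processor in P2 looks roughly like this:

```
PUT https://nifi-host:8443/nifi-api/processors/<processor-id>/run-status
Content-Type: application/json

{"revision": {"version": 0}, "state": "RUNNING"}
```

Set the state to "STOPPED" to stop it again; in a secured cluster the request also needs the appropriate authentication token or certificate.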