Member since
07-29-2020
574
Posts
323
Kudos Received
176
Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2115 | 12-20-2024 05:49 AM |
| | 2414 | 12-19-2024 08:33 PM |
| | 2161 | 12-19-2024 06:48 AM |
| | 1447 | 12-17-2024 12:56 PM |
| | 2062 | 12-16-2024 04:38 AM |
11-26-2022
05:49 PM
Hi, I'm not sure if what I'm suggesting is the best solution, so if anyone has a better one please advise. You can solve this in two different ways:

1. If you know the schema and you don't mind adding a header, you can use the QueryRecord processor: add the header first (see https://stackoverflow.com/questions/58707242/how-to-add-headers-to-a-csv-using-apache-nifi) and then use a QueryRecord processor for each value to query out the different datasets (a sketch of the SQL is included at the end of this reply).

2. If you don't want to add a header, you can do the following:

a. Use a SplitText processor to split the content into individual lines.

b. Use a RouteOnContent for each value (Student, Teacher, Class...) to filter the records for each dataset. For example, a RouteOnContent with a dynamic property called "Student" and the search value "(Student)", where the Match Requirement property is set to "content must contain match", will route only the records that contain Student.

c. Use a MergeContent processor to merge the result sets back together.

Let me know if you have any questions. If you find this answers your question, please accept the solution. Thanks
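As a rough illustration of option 1 (my own sketch, not something from the original post): assuming the header you add names the type column record_type - a placeholder name - the SQL in the QueryRecord dynamic properties could look like the following, shown here as a map of property name to SQL statement:

{
  "students": "SELECT * FROM FLOWFILE WHERE record_type = 'Student'",
  "teachers": "SELECT * FROM FLOWFILE WHERE record_type = 'Teacher'",
  "classes": "SELECT * FROM FLOWFILE WHERE record_type = 'Class'"
}

Each dynamic property name becomes an outgoing relationship on the QueryRecord processor, so the three datasets can be routed to separate downstream flows.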
11-25-2022
11:31 AM
1 Kudo
Hi @Green_ After further investigation, I found that the reason the result comes back blank is that we are missing a point from the processor description itself: "...The user must specify at least one Record Path, as a dynamic property, pointing to a field of type ARRAY containing RECORD objects..." Since the values in the array are plain values and not records, it is probably not working as expected. When I make the input look like the following, it works with the path specified: {
"Key1": [
{
"NestedKey1": "Value1",
"NestedKey2": [
{
"nestedValue": "Value2"
},
{
"nestedValue": "Value3"
},
{
"nestedValue": "Value4"
},
{
"nestedValue": "Value5"
}
],
"NestedKey3": "Value6"
}
]
} As a suggestion - if that works for @Fredi - use a Jolt transformation to convert the array values into records as shown above (a sketch of such a spec is at the end of this reply), and then use the ForkRecord processor to achieve the desired result. The schema for the JSON record writer can be as simple as the following: {
"type": "record",
"name": "TestObject",
"namespace": "ca.dataedu",
"fields": [{
"name": "NestedKey1",
"type": ["null", "string"],
"default": null
}, {
"name": "NestedKey3",
"type": ["null", "string"],
"default": null
}, {
"name": "nestedValue",
"type": ["null", "string"],
"default": null
}]
} Hope that helps. Thanks
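For reference, here is a rough sketch (my own, not from the thread) of a Jolt shift spec that would produce the array-of-records shape shown above, assuming the original input carried NestedKey2 as a plain string array, i.e. { "Key1": [ { "NestedKey1": "Value1", "NestedKey2": ["Value2", "Value3", "Value4", "Value5"], "NestedKey3": "Value6" } ] }:

[
  {
    "operation": "shift",
    "spec": {
      "Key1": {
        "*": {
          "NestedKey1": "Key1[&1].NestedKey1",
          "NestedKey2": {
            "*": "Key1[&2].NestedKey2[&].nestedValue"
          },
          "NestedKey3": "Key1[&1].NestedKey3"
        }
      }
    }
  }
]

This can be run in a JoltTransformJSON processor ahead of ForkRecord; adjust the spec if the real input differs from the assumed shape.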
11-25-2022
09:07 AM
Hi @Green_ @Fredi , I just tried it out of the box with the example mentioned in the documentation: https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.17.0/org.apache.nifi.processors.standard.ForkRecord/additionalDetails.html It seems to only produce the correct result when you provide a schema for the record writer service. Not sure if that is related, but when I tried without providing a schema in the JSON record writer, it gave me all the values as null!
11-25-2022
07:55 AM
1 Kudo
Hi, It seems like your best out-of-the-box option is to use the ForkEnrichment / JoinEnrichment processors. If each record in the original data had an equivalent record in the response, this would be very easy: you could use the Wrapper or Insertion strategy to enrich your data into something close to what you expect. However, since you mentioned that successful records won't have an equivalent record in the response, you can take advantage of the SQL join strategy, where you do a SQL join between the two inputs (original & response) using a common ID and generate the desired output. The SQL strategy gives you a lot of flexibility, but you have to be careful with performance.

The challenge with the SQL strategy - or any JoinEnrichment strategy - is having a common link between the two inputs. In your case the only link is the record index, so you might use a Jolt transformation to produce such a link (a rough sketch of one such Jolt spec is at the end of this reply). For example, your original data would look like this:

{
  "data": [
    {
      "index": "Record 1",
      "id": "1234569",
      "Date": "2022-08-22"
    },
    {
      "index": "Record 2",
      "id": "1234567",
      "Date": "2022-08-22"
    },
    ...
  ]
}

and your response would look like this after the Jolt transformation:

[
  {
    "Index": "Record 2",
    "Record 2": "Invalid values for emp info"
  },
  {
    "Index": "Record 3",
    "Record 3": "Invalid values for emp info"
  }
]

In this case you can do the SQL join enrichment using "index" as the join key. For more info regarding fork/join enrichment, see the following link: https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.12.1/org.apache.nifi.processors.standard.JoinEnrichment/additionalDetails.html

If you find this helpful, please accept the solution. Thanks
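As a rough sketch of the Jolt piece for the response side (my own illustration - the raw response format isn't shown here, so this assumes it comes back as a simple map of record name to error message, e.g. { "errors": { "Record 2": "Invalid values for emp info", "Record 3": "Invalid values for emp info" } }), a shift spec like the following would reshape it into a joinable array, with the message under a fixed field name ("message" is my choice):

[
  {
    "operation": "shift",
    "spec": {
      "errors": {
        "*": {
          "$": "[#2].Index",
          "@": "[#2].message"
        }
      }
    }
  }
]

That would yield [ { "Index": "Record 2", "message": "Invalid values for emp info" }, { "Index": "Record 3", "message": "Invalid values for emp info" } ], and the SQL join strategy can then join the two inputs on "index" = "Index".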
11-24-2022
03:33 PM
Hi Mohamed, I know the frustration. It's been a while, honestly, and I don't recall how I resolved it, but I do remember that when I upgraded to 1.16 it took a few rounds of uninstall/reinstall for it to work correctly. Can you please post what you have in your authorizers.xml and what is in the nifi.properties file regarding the security configuration - like I did above. Also keep in mind that the Initial User Identity is case sensitive, so make sure the identity associated with the certificates in the truststore and keystore and the one you define in the authorizer use the same letter case. Let me know. Thanks
11-10-2022
12:08 PM
Not sure how big your JSON is and whether it is well formatted across multiple lines. Make sure the Evaluation Mode is set to Line-by-Line, and you can also increase the Maximum Buffer Size in case the text being processed is greater than 1 MB. Also, what version of NiFi are you using? There seems to be a bug around this as well, where the flowfile remains in the upstream queue and the overflow error is thrown: https://issues.apache.org/jira/browse/NIFI-10154
11-10-2022
11:26 AM
Can you share the configuration of the ReplaceText processor? Also, how big is the JSON file?
11-10-2022
08:14 AM
Hi, I tried the following pattern on the file you sent and it worked: [\x03]+
11-09-2022
01:09 PM
Can you send me sample JSON data that reproduces the error? The one you posted seems to be valid and I'm able to split it.
11-09-2022
08:37 AM
Hi, I think you are having an issue because there are carriage returns (\r\n) in the JSON. Try using a regex replace for the following as well: [\r\n] Hope that helps; if it does, please accept the solution. Thanks