Created 04-27-2022 12:29 AM
Hi, I have a flow that receives JSON arrays as input. I would like to validate the schema of each of these JSON records, but the ValidateRecord processor doesn't quite do the job. I need to validate things such as certain fields being enum values, fields having a max/min length, and required fields being present (sometimes inside optional nested JSON objects).
It seems an Avro schema does not support some of these constraints, and as such the Record processors can't validate my data the way I need.
I would love to hear if anyone has had a similar use case and how they solved it. I am considering the ScriptedValidateRecord processor, but I would prefer to avoid that and might instead use EvaluateJsonPath to extract all the fields I want to validate and then RouteOnAttribute with the expression language to filter out bad records. If there is a more appropriate way to validate records like this, I'm all ears.
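For reference, the constraints described above (enums, length limits, required fields inside an optional nested object) are exactly the kind of thing JSON Schema expresses and Avro does not. A hypothetical schema, with made-up field names just for illustration, might look like:

```json
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "type": "object",
  "required": ["name", "status"],
  "properties": {
    "name": { "type": "string", "maxLength": 50 },
    "status": { "enum": ["ACTIVE", "INACTIVE"] },
    "address": {
      "type": "object",
      "required": ["city"],
      "properties": { "city": { "type": "string" } }
    }
  }
}
```

Note that `address` itself is not listed in the top-level `required` array, so it is optional, but when it is present its own `required` list makes `city` mandatory.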
Thanks in advance!
Created 10-21-2022 07:42 AM
I am also having this issue: needing to validate JSON against a JSON schema. Is there any way this can be done without installing schema packages for Groovy or Python to use in the ExecuteScript processor? Or should my team just use XML files, since there is a ValidateXML processor?
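If the goal is just to avoid extra schema packages, the checks themselves can be written with nothing but the standard library. A minimal sketch (the field names, allowed values, and limits here are entirely hypothetical, and the function body is the kind of logic you could adapt into an ExecuteScript or ScriptedValidateRecord script):

```python
import json

# Hypothetical constraints mirroring the checks discussed above:
# "status" must come from a fixed enum, "name" has a max length, and
# the optional nested "address" object must contain "city" when present.
ALLOWED_STATUS = {"ACTIVE", "INACTIVE"}
MAX_NAME_LEN = 50

def validate(record):
    """Return a list of validation errors; an empty list means valid."""
    errors = []
    # Required top-level fields.
    for field in ("name", "status"):
        if field not in record:
            errors.append("missing required field: %s" % field)
    # Enum check.
    if "status" in record and record["status"] not in ALLOWED_STATUS:
        errors.append("status must be one of %s" % sorted(ALLOWED_STATUS))
    # Max-length check.
    if len(record.get("name", "")) > MAX_NAME_LEN:
        errors.append("name exceeds %d characters" % MAX_NAME_LEN)
    # Required field inside an optional nested object.
    address = record.get("address")
    if address is not None and "city" not in address:
        errors.append("address present but missing required field: city")
    return errors

good = json.loads('{"name": "Alice", "status": "ACTIVE", "address": {"city": "Oslo"}}')
bad = json.loads('{"name": "Bob", "status": "UNKNOWN", "address": {}}')
print(validate(good))  # []
print(validate(bad))   # two errors: bad enum value, address missing city
```

This obviously trades the generality of a real JSON Schema validator for zero dependencies, so it only makes sense when the rule set is small and stable.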
Created 10-21-2022 07:54 AM
In the end, I never found a perfect solution and opted to use attributes with RouteOnAttribute. If you find that ValidateXML can verify all the checks you need, it shouldn't be too bad to use a ConvertRecord processor to transform JSON to XML for validation (or simply to use XML instead of JSON, if that fits your use case).
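For anyone landing on this thread later, the attribute approach looks roughly like this (the JSON paths and attribute names are illustrative): EvaluateJsonPath pulls the fields of interest into flowfile attributes, e.g. a property `status` set to `$.status` and `name` set to `$.name`, and then a RouteOnAttribute property routes valid records with expression language along the lines of:

```
valid : ${status:matches('ACTIVE|INACTIVE'):and(${name:length():le(50)})}
```

Records that don't match fall through to the `unmatched` relationship, which can be routed to a failure/quarantine path.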
Created 10-21-2022 10:09 AM
Ok thanks. I figured as much.
Created 11-19-2022 01:35 PM
I am bumping this question in hopes someone might know of a better solution