The S3 bucket path of a file (source) and a destination path (local file share) are sent from custom Java code. NiFi has to convert this into JSON format, fetch the file from the S3 bucket, and place it in the local file share.
Can somebody share their thoughts on this and give me an idea of how to implement it?
If your custom code can send flow files with attributes containing the source and destination information, you can use FetchS3Object to get the file from S3, then PutFile to put it in a local file share. If your custom code does not use the NiFi API, then consider ExecuteScript with Groovy (specifying your JARs in the Module Directory property) and calling the code from there, or perhaps even ExecuteStreamCommand if you want to (or must) call it from the command line. For the former option, I discuss how to use modules in code in part 3 of my ExecuteScript Cookbook series (and the other parts have related examples).
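To make the first option concrete, here is a minimal sketch of the relevant processor properties, assuming the incoming flow files carry hypothetical attributes named s3.bucket, s3.key, and destination.path (the actual attribute names depend on what your custom code sets):

```
FetchS3Object
  Bucket:     ${s3.bucket}
  Object Key: ${s3.key}

PutFile
  Directory:  ${destination.path}
```

Both processors support NiFi Expression Language in these properties, so the values are resolved per flow file from its attributes.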
Thanks for the response.
Small correction: the custom code publishes the message to a Kafka topic, from which I consume the JSON message and pass it to an EvaluateJsonPath processor. EvaluateJsonPath then extracts two values: the source path and the destination path.
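For illustration, assuming the Kafka message looks something like this (the field names here are made up, mine may differ):

```json
{
  "sourcePath": "s3://my-bucket/path/to/file.txt",
  "destinationPath": "/mnt/share/incoming"
}
```

EvaluateJsonPath (with Destination set to flowfile-attribute) is then configured with user-defined properties such as source.path = $.sourcePath and destination.path = $.destinationPath, so each extracted value lands in a flow file attribute.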
Could you explain this briefly?
Right now my flow looks like the attached screenshot.