Created 06-12-2018 07:41 PM
The S3 bucket path of the file (source) and the destination path (a local file share) are sent from custom Java code. NiFi has to convert this into JSON format, fetch the file from the S3 bucket, and place it in the local file share.
Can somebody share their thoughts on this and give me an idea of how to implement it?
Created 06-13-2018 01:56 AM
If your custom code can send flow files with attributes containing the source and destination information, you can use FetchS3Object to get the file from S3, then PutFile to put it in a local file share. If your custom code does not use the NiFi API, consider ExecuteScript with Groovy (specifying your JARs in the Module Directory property) and calling the code from there, or perhaps even ExecuteStreamCommand if you want to (or must) call it from the command line. For the former option, I discuss how to use modules in code in Part 3 of my ExecuteScript Cookbook series (and the other parts have related examples).
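The attribute-driven approach can be sketched as follows. The attribute names here are hypothetical examples; any names work as long as the Expression Language references in the processor properties match them:

```
# Flow file attributes arriving from the custom code (example names):
#   s3.bucket      = my-bucket
#   s3.object.key  = incoming/data/file.bin
#   dest.directory = /mnt/share/incoming

FetchS3Object
  Bucket     : ${s3.bucket}
  Object Key : ${s3.object.key}

PutFile
  Directory  : ${dest.directory}
```

FetchS3Object replaces the flow file content with the S3 object's bytes, and PutFile then writes that content into the directory given by its Directory property.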
Created on 06-13-2018 02:41 PM - edited 08-17-2019 07:20 PM
Thanks for response.
A small correction: the custom code publishes the message to a Kafka queue, and from there I consume the JSON message and pass it to the EvaluateJsonPath processor. EvaluateJsonPath now has two values: one is the source path and one is the destination path.
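Concretely, the JSON message from Kafka looks something like this (the field names are just examples):

```
{
  "sourcePath": "incoming/data/file.bin",
  "destinationPath": "/mnt/share/incoming"
}
```

With EvaluateJsonPath's Destination property set to flowfile-attribute, two dynamic properties such as source.path = $.sourcePath and destination.path = $.destinationPath would place both values into flow file attributes that downstream processors can reference.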
As you said, I can use FetchS3Object to get the file from S3, but how should I pass the source path to the FetchS3Object processor, and then how should I pass the destination path to the PutFile processor?
Could you briefly explain?
Right now my flow looks like the attached screenshot. PFA...