Support Questions
Find answers, ask questions, and share your expertise

NiFi for reading JSON data, extracting source and destination paths, and dropping the file at the destination?


The S3 bucket path of the file (source) and the destination path (a local file share) are sent from custom Java code. NiFi has to convert this into JSON format, extract the file from the S3 bucket, and place it in the local file share.
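For concreteness, the JSON message carrying the two paths might look like the following (the field names and paths here are illustrative assumptions, not taken from the original question):

```json
{
  "sourcePath": "my-bucket/incoming/report.csv",
  "destinationPath": "/mnt/fileshare/reports"
}
```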

Can somebody share their thoughts on this and give me an idea of how to implement it?


Super Guru

If your custom code can send flow files with attributes containing the source and destination information, you can use FetchS3Object to get the file from S3, then PutFile to put it in a local file share. If your custom code does not use the NiFi API, then consider ExecuteScript with Groovy (specifying your JARs in the Module Directory property) and calling the code from there, or perhaps even ExecuteStreamCommand if you want to (or must) call it from the command line. For the former option, I discuss how to use modules in code in part 3 of my ExecuteScript Cookbook series (and the other parts have related examples).


Thanks for the response.

Small correction: the custom code publishes the message to a Kafka queue, and from there I pick up the JSON message and pass it to the EvaluateJsonPath processor. EvaluateJsonPath now has two values: one is the source path and the other is the destination path.
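As a sketch, EvaluateJsonPath can be configured to write both values into flow file attributes rather than the flow file content (the attribute names and JsonPath expressions below are assumptions for illustration):

```
EvaluateJsonPath
  Destination:       flowfile-attribute
  source.path:       $.sourcePath
  destination.path:  $.destinationPath
```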

As you said, I can use FetchS3Object to get the file from S3, but how should I pass the source path to the FetchS3Object processor, and then how should I pass the destination path to the PutFile processor?

Could you explain this briefly?

Right now my flow looks like the attached screenshot.
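One way to wire this up, sketched with NiFi Expression Language: once EvaluateJsonPath has written the two values to flow file attributes (assumed here to be named `source.path` and `destination.path`), the downstream processors can reference those attributes directly in their properties:

```
FetchS3Object
  Bucket:      my-bucket           (assumption: a fixed bucket; otherwise split the
                                    bucket out of the source path upstream)
  Object Key:  ${source.path}

PutFile
  Directory:   ${destination.path}
```

Note that FetchS3Object expects the bucket and the object key as separate properties, so if the source path arrives as a full S3 URI it may need to be split into bucket and key first (for example with UpdateAttribute and Expression Language functions).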


