Created 01-04-2018 05:47 AM
Hey, I want to copy a file from a remote Linux machine to Hadoop using the scp command.
I also want to pass the parameters (server IP, path, etc.) dynamically.
Ex:
scp [options] username1@source_host:directory1/filename1 username2@destination_host:directory2/filename2
I will pass all of these parameters dynamically each time through the NiFi API.
How is this possible? Do we need to create our own processor?
Thanks ,
Paresh
Created 01-09-2018 09:21 PM
There are already processors designed to make system calls:
ExecuteProcess -- Runs an operating system command specified by the user and writes the output of that command to a FlowFile.
ExecuteStreamCommand -- Executes an external command on the contents of a flow file, and creates a new flow file with the results of the command.
ExecuteScript -- Executes a script given the flow file and a process session.
I would think ExecuteProcess would be the best fit for your use case.
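As a rough illustration, here is a minimal ExecuteScript (Jython) sketch that builds the scp command from flow file attributes, so the host and paths can change per flow file. The attribute names (scp.user, scp.host, scp.src, scp.dest) are made up for this example, and it assumes key-based (passwordless) SSH from the NiFi node, since scp cannot prompt for a password here:

import subprocess

flowFile = session.get()
if flowFile is not None:
    # Hypothetical attribute names -- set them upstream,
    # e.g. via UpdateAttribute or through the NiFi API.
    user = flowFile.getAttribute('scp.user')
    host = flowFile.getAttribute('scp.host')
    src = flowFile.getAttribute('scp.src')
    dest = flowFile.getAttribute('scp.dest')  # local landing directory

    # Pull the remote file down to the local landing directory.
    cmd = ['scp', '%s@%s:%s' % (user, host, src), dest]
    returncode = subprocess.call(cmd)

    if returncode == 0:
        session.transfer(flowFile, REL_SUCCESS)
    else:
        session.transfer(flowFile, REL_FAILURE)

Once the file lands in that local directory, you could pick it up with GetFile and push it into Hadoop with PutHDFS.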