
NiFi: copy a file from a remote Linux source to another server (with NiFi and Hadoop installed) using the scp command or an alternative (not the site-to-site protocol)


Hey, I want to copy a file from a remote Linux machine to Hadoop using the scp command.

I also want to pass the parameters (server IP, path, etc.) dynamically.

Example:

scp [options] username1@source_host:directory1/filename1 username2@destination_host:directory2/filename2

I will pass all of these parameters dynamically each time through the NiFi API.

How is this possible? Do we need to create our own processor?

Thanks ,

Paresh
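As a hedged sketch of what such a dynamic invocation could look like (the usernames, hosts, and paths below are made-up placeholders, not values from this thread), the scp command can be assembled from parameters by a small shell function, which a NiFi processor could then launch:

```shell
#!/bin/sh
# Hypothetical wrapper: builds the scp command from dynamic arguments.
# Printing the command (dry run) instead of executing it lets the
# construction be inspected; drop the echo/printf to actually run scp.
build_scp_cmd() {
  user="$1"; host="$2"; src="$3"; dest="$4"
  printf 'scp %s@%s:%s %s\n' "$user" "$host" "$src" "$dest"
}

# Example with placeholder values:
build_scp_cmd nifi 10.0.0.5 /data/in.csv /tmp/in.csv
```

Note that for non-interactive use from NiFi, scp would need key-based SSH authentication set up between the hosts, since there is no terminal to answer a password prompt.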

1 ACCEPTED SOLUTION

@Paresh Baldaniya

There are already processors designed to make system calls.

ExecuteProcess -- Runs an operating system command specified by the user and writes the output of that command to a FlowFile.

ExecuteStreamCommand -- Executes an external command on the contents of a flow file, and creates a new flow file with the results of the command.

ExecuteScript -- Executes a script given the flow file and a process session.

I would think ExecuteProcess would be the best fit for your use case.
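As a rough sketch of such a configuration (the property names come from the ExecuteProcess processor; the `src.*`/`dest.*` variable names and values are assumptions for illustration, and Expression Language on these properties resolves against the variable registry rather than incoming FlowFile attributes), the processor properties could look like:

```
Command:             scp
Command Arguments:   ${src.user}@${src.host}:${src.path} ${dest.path}
Argument Delimiter:  (space, the default)
```

If the parameters need to come from FlowFile attributes per request instead, ExecuteStreamCommand (which accepts incoming FlowFiles) may be the better match.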
