I would like to ask your opinion on the following subject:
I am planning to use Apache NiFi to orchestrate my dataflow. I have used it before, so it is somewhat familiar to me.
Now, if I have a process which needs a GPU, and that GPU is located on a different Linux machine, what would be a good way to send commands to that GPU machine so it starts processing files from the dataflow?
My first processor pulls files from FTP. The next one applies some normalization to them, and the third processor is the GPU machine phase.
The files end up on that GPU machine, and nothing but text is output at the end. That output text should be pushed to SQL.
Question: how should that GPU machine be activated to process files from the NiFi flow?
Thanks for your answers.
You can create a shell script (invoked over SSH) with all the commands you wish to run on the GPU machine, and trigger that shell script from NiFi using a processor such as ExecuteScript or ExecuteStreamCommand.
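A minimal sketch of that approach (the host, user, and remote script path below are placeholders, not part of the original answer): ExecuteStreamCommand pipes the incoming flowfile to the script's stdin, `ssh` forwards it to the remote command, and whatever the remote command prints to stdout becomes the outgoing flowfile content.

```shell
#!/bin/sh
# run_gpu_job.sh -- the wrapper ExecuteStreamCommand would call.
# GPU_HOST and REMOTE_CMD are placeholder values; adjust to your setup.
GPU_HOST="gpuuser@gpu-host"
REMOTE_CMD="/opt/jobs/process_files.sh"

# In a real flow this line would be:
#   exec ssh "$GPU_HOST" "$REMOTE_CMD"
# ExecuteStreamCommand pipes the flowfile to this script's stdin,
# ssh forwards stdin to the remote command, and the remote stdout
# flows back to NiFi as the new flowfile content.
# Shown as an echo here so the sketch runs without a real GPU host:
echo "would run: ssh $GPU_HOST $REMOTE_CMD"
```

This assumes passwordless (key-based) SSH from the NiFi host to the GPU machine. In ExecuteStreamCommand you would point Command Path at this script and route the output stream relationship onward to the SQL step.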
Thanks for your answer! I had thought about that SSH execution approach. But what if I install NiFi on the GPU machine, create a remote process group there, and let it handle forwarding the output back to the NiFi instance that called the remote process group? That way I would also get better input/output statistics.
Does that sound like too heavy a solution, or more like the NiFi way?
I can push the GPU machine's output text to the SQL server and take it from there (with the SSH execution approach). I would just like to have some control over what is happening on the GPU machine during execution, and the SSH way kind of loses track of what is happening (vs. a remote process group), or have I misunderstood?
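The SSH approach is not entirely a black box: ExecuteStreamCommand records the command's exit code in the `execution.status` flowfile attribute, so propagating the remote exit code (and logging the remote stderr on the GPU machine) gives the flow something to route on. A rough sketch, with `run_remote` as a stand-in for the real `ssh` call and all paths as placeholders:

```shell
#!/bin/sh
# Sketch: keep some visibility with the SSH approach.
# In a real script run_remote would be replaced with something like:
#   ssh gpuuser@gpu-host '/opt/jobs/process_files.sh 2>>/var/log/gpu_job.log'
# so remote stderr is kept on the GPU machine for inspection.
run_remote() {
    # stand-in for the ssh call, so this sketch runs anywhere
    echo "processed text"
    return 0
}

run_remote
STATUS=$?
# Ending the script with `exit $STATUS` propagates the remote exit
# code; ExecuteStreamCommand exposes it as execution.status, so a
# RouteOnAttribute processor can send failures down a retry/alert path.
```

This only recovers coarse success/failure signals, not the per-file provenance and statistics a remote process group would give you, so the trade-off you describe is real.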