
How to stop a processor after the PySpark job has been completed

Rising Star

Hi,

I'm using a PySpark processor in NiFi 1.6.0 to execute a script and store the results on the local filesystem. I can see the operation being performed, but how do I let the processor know to stop once it has finished executing the script? It keeps running as if it were streaming, and what concerns me most is that once the result files get moved it starts throwing errors in the logs.

@Shu, suggestions from you would be helpful as well.
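In case it helps, here is a rough sketch (not from the thread) of one way to stop the triggering processor from the script itself via NiFi's REST API: fetch the processor entity to get its current revision, then PUT it back with the state set to STOPPED. The NiFi URL and processor ID below are placeholders, and the exact payload can vary slightly between NiFi versions.

# Hypothetical sketch: have the script stop its own NiFi processor when done.
# NIFI_API and PROCESSOR_ID are placeholders; adjust for your environment.
import requests

NIFI_API = "http://localhost:8080/nifi-api"      # assumed NiFi host/port
PROCESSOR_ID = "replace-with-processor-uuid"     # the processor running the script

def stop_processor(api_base, processor_id):
    """Read the processor's current revision, then update its state to STOPPED."""
    url = "{0}/processors/{1}".format(api_base, processor_id)
    entity = requests.get(url).json()
    payload = {
        "revision": entity["revision"],          # NiFi rejects updates without the current revision
        "component": {"id": processor_id, "state": "STOPPED"},
    }
    requests.put(url, json=payload).raise_for_status()

# ... run the PySpark job and write results to the local filesystem ...
stop_processor(NIFI_API, PROCESSOR_ID)

Whether this is appropriate depends on how the processor is scheduled; an alternative is simply to schedule it less aggressively (for example, with a cron-driven run schedule) so it doesn't keep firing after the files have been moved.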

1 Reply

Super Guru

By "a pyspark processor" do you mean ExecuteSparkInteractive?