How to stop a processor after the pyspark job has been completed

Rising Star

Hi,

I'm using a pyspark processor in NiFi 1.6.0 to execute a script and store the results on the local filesystem. I can see the operation is being performed, but how do I tell the processor to stop once the script has finished executing? It keeps streaming. What concerns me most is that once the output files get moved, the processor starts throwing errors in the logs, which is not acceptable.

@Shu, suggestions from you would be helpful as well.

1 REPLY

Re: How to stop a processor after the pyspark job has been completed

Super Guru

By "a pyspark processor" do you mean ExecuteSparkInteractive?
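If the underlying goal is to stop a processor programmatically once the job finishes, one approach (a sketch only, not something the thread confirms) is to call NiFi's REST API from the end of the script: `PUT /nifi-api/processors/{id}/run-status` sets a processor's state to `STOPPED`. The NiFi URL, processor ID, and revision version below are placeholders you would need to supply from your own instance.

```python
# Hedged sketch: stop a NiFi processor via its REST API after a job completes.
# The URL, processor ID, and revision version here are placeholder assumptions;
# look them up in your NiFi UI or via GET /nifi-api/processors/{id}.
import json
import urllib.request


def build_stop_request(nifi_url, processor_id, revision_version):
    """Build the PUT request that sets a processor's run status to STOPPED.

    NiFi requires the current revision version in the body so it can detect
    concurrent modifications (optimistic locking).
    """
    payload = {
        "revision": {"version": revision_version},
        "state": "STOPPED",
    }
    return urllib.request.Request(
        url=f"{nifi_url}/nifi-api/processors/{processor_id}/run-status",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="PUT",
    )


# Example (placeholder processor ID); actually sending the request
# requires a reachable NiFi instance:
# urllib.request.urlopen(
#     build_stop_request("http://localhost:8080", "abc-123", 0)
# )
```

On a secured cluster you would also need to authenticate (e.g. pass a bearer token in the `Authorization` header), which this sketch omits.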
