Support Questions


Scheduling ExecuteStreamCommand processor with GenerateFlowFile to run every minute

Contributor

I want to use ExecuteStreamCommand to submit a Spark job via the shell, and I want to use GenerateFlowFile so that I can detect Spark job failures and handle them with RouteOnAttribute, as suggested by Matt's answer here.

I think the failure detection worked, but I can't get the scheduling right.

If I want the whole flow (generating the flow file, ExecuteStreamCommand, and routing) to execute every minute, should I schedule GenerateFlowFile every minute and leave ExecuteStreamCommand at its default (0 sec) schedule, or should I schedule both?

I tried different combinations, but none worked properly. I think GenerateFlowFile keeps generating flow files, but ExecuteStreamCommand doesn't run multiple times.


Another problem: when I stop the ExecuteStreamCommand processor, it gets stuck. I can't change its configuration, and I can't stop or start it again; it doesn't work until I restart NiFi.

Please help.

20 REPLIES

@Mahmoud Yusuf

Have you tried setting the Scheduling Strategy to CRON driven for the processors in the flow? That way, the processors' runs should stay in sync.

Contributor

I don't understand exactly what you mean.
In a previous question, the answer was:

"You could schedule a GenerateFlowFile at the same rate your ExecuteProcess was scheduled for, and set Ignore STDIN to true in ExecuteStreamCommand. Then the outgoing flow files will have the execution.status attribute set, which you can use with RouteOnAttribute to handle failures (non-zero exit codes, e.g.)"
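As a side note, the execution.status check described in that quoted answer can be expressed as a RouteOnAttribute property using NiFi Expression Language; the property name "failed" below is just an illustrative choice:

```
failed : ${execution.status:equals('0'):not()}
```

Flow files from a run with a non-zero exit code match this rule and are routed to the "failed" relationship; successful runs go to "unmatched".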

I want to know how to schedule these two processors together so that the whole flow executes every minute.

@Mahmoud Yusuf

Run the processors like this. Schedule the first processor, GenerateFlowFile, to run every minute of every hour.

[attachment: 34586-screen-shot-2017-08-29-at-70223-pm.png]

Then the next processor should run at the first second of every minute of every hour.

[attachment: 34587-screen-shot-2017-08-29-at-70501-pm.png]

And then the last processor at the second second of every minute of every hour.

[attachment: 34588-screen-shot-2017-08-29-at-70558-pm.png]

Do you follow?
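For reference, NiFi's CRON driven strategy uses Quartz cron expressions (fields: seconds, minutes, hours, day of month, month, day of week). The staggered schedules described above would look roughly like this:

```
GenerateFlowFile:      0 * * * * ?    (second 0 of every minute)
ExecuteStreamCommand:  1 * * * * ?    (second 1 of every minute)
RouteOnAttribute:      2 * * * * ?    (second 2 of every minute)
```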

Contributor

@Wynner OK, the schedule seems to be working. When the submitted job fails, it works fine and the flow is OK.
But once the job runs without errors, flow files keep being generated every minute while ExecuteStreamCommand is stuck. I can't even stop or start it; I need to restart NiFi to make it run again.

When I try to stop/start ExecuteStreamCommand, it says: "No eligible components are selected. Please select the components to be stopped."

[attachment: 38460-screenshot-from-2017-08-30-12-38-52.png]

Contributor

@Wynner
Here's what I'm trying to illustrate:
One successful execution at ExecuteStreamCommand, then it gets stuck (flow files keep being generated, but ExecuteStreamCommand is stuck):

[attachment: 38461-screenshot-from-2017-08-30-12-50-26.png]

------

If no execution succeeds at all (all executions fail), the schedule works well, as follows (flow files are generated every minute, and ExecuteStreamCommand executes every minute):

[attachment: 38462-screenshot-from-2017-08-30-12-59-17.png]

I don't know why it gets stuck in the first case. Please help.

@Mahmoud Yusuf

The reason you cannot stop the ExecuteStreamCommand processor is that it still has a running thread. How long does your script take to run outside of NiFi? It seems like the script is not finishing, so the ExecuteStreamCommand processor is just waiting.
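If you want to confirm there is a hung thread, NiFi can produce a thread dump showing what each processor thread is doing. A minimal sketch, run from the NiFi installation directory (the output filename here is arbitrary):

```shell
# writes a thread dump of the running NiFi instance to the named file;
# look for the ExecuteStreamCommand thread to see where it is blocked
bin/nifi.sh dump thread-dump.txt
```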

@Mahmoud Yusuf

When you say about a minute, does that mean less than a minute or more? Why don't you try generating a flow file every 2 minutes and see if that works better? Or, if it is possible to run the script in parallel, give the ExecuteStreamCommand processor 2 concurrent tasks instead of one.


@Mahmoud Yusuf

In my experience, if you aren't making a call to a system-level command, the processor does sometimes have issues.

Try putting the actual "spark-submit <path to jar>" command into a shell script, and then call the shell script from the ExecuteStreamCommand processor. I have found that method more reliable.
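For example, a minimal wrapper script might look like this (the class name and jar path below are placeholders, substitute your real spark-submit invocation):

```shell
#!/bin/sh
# run_spark_job.sh -- hypothetical wrapper; class name and jar path are
# placeholders for your actual spark-submit invocation
spark-submit --class com.example.MyJob /opt/jobs/my-job.jar "$@"
# the script exits with spark-submit's status, which ExecuteStreamCommand
# surfaces in the execution.status attribute for RouteOnAttribute
```

Then point ExecuteStreamCommand's Command Path at the script and leave Command Arguments empty.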

@Mahmoud Yusuf

I'm glad it is working for you now.