I use the ExecuteStreamCommand processor to submit a Spark job via the "spark-submit" command, and it is scheduled to run every minute.
I noticed strange behavior from the processor: when I opened the Spark UI, I found that the job gets stuck at a certain task and makes no further progress. The processor also hangs.
When I submit the job with the same command from the terminal (command line), outside NiFi, it executes and terminates successfully.
I don't know the reason for this strange behaviour.
When I run the same command from the "ExecuteProcess" processor, it also completes successfully. However, I need to submit the job using ExecuteStreamCommand so that I can detect job failure.
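For context, the failure detection I want boils down to checking the command's exit status, which is why ExecuteStreamCommand matters to me. A minimal sketch of that check (using a hypothetical `fake_spark_submit` function in place of my real spark-submit invocation, since the actual command and job details are omitted here):

```shell
#!/usr/bin/env bash
# fake_spark_submit stands in for the real spark-submit call;
# it simulates a Spark job that fails with a non-zero exit code.
fake_spark_submit() {
  return 1
}

# The kind of exit-status check I rely on for detecting failure.
if fake_spark_submit; then
  echo "job succeeded"
else
  echo "job failed with exit code $?"
fi
```

Running the command from ExecuteProcess would not give me this kind of failure signal, which is why I cannot simply switch processors.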
