I use ExecuteStreamCommand to submit a Spark job with the "spark-submit" command, and it's scheduled to run every minute. I noticed strange behavior from the processor: when I opened the Spark UI, I found that the job gets stuck at a certain task and makes no progress, and the processor hangs as well.
I tried submitting the job with the exact same command from the terminal (command line) outside NiFi, and it executed and terminated successfully. I don't know what causes this strange behaviour.
When I run the same command from the "ExecuteProcess" processor, it also completes successfully. But I need to submit the job using ExecuteStreamCommand so that I can detect job failures.
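For context, ExecuteStreamCommand records the command's exit code in the "execution.status" flowfile attribute, which is what makes failure detection possible (ExecuteProcess does not give you this). A minimal shell sketch of the same idea, using "true" and "false" as stand-ins for a successful and a failing spark-submit (illustrative only, not the poster's actual command):

```shell
# A command's exit code distinguishes success from failure; ExecuteStreamCommand
# captures this value as the execution.status attribute on the output flowfile.
true                              # stand-in for a spark-submit that succeeds
echo "success status: $?"         # prints 0
false                             # stand-in for a spark-submit that fails
echo "failure status: $?"         # prints a non-zero code (1)
```

Downstream, you can route on execution.status (e.g. with RouteOnAttribute) to separate failed submissions from successful ones.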
Can you please check the "Ignore STDIN" property? If it is set to true, the incoming flowfile content will not be passed to the command's STDIN, and the command can proceed without waiting on input.
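The reason this matters: a child process that reads STDIN will block until STDIN reaches end-of-file, and if NiFi holds the pipe open the process can appear stuck. A small sketch (using "cat" as a stand-in for any STDIN-reading command, an assumption for illustration):

```shell
# "cat" reads STDIN until EOF. Redirecting from /dev/null (analogous to
# Ignore STDIN=true) closes STDIN immediately, so the command cannot block
# waiting for input and exits cleanly.
cat < /dev/null
echo "exit=$?"    # prints exit=0
```

With an open pipe that is never closed, the same "cat" would hang indefinitely, which matches the stuck-processor symptom described above.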
However, from the Spark UI you can see that it was executing an action; click that link to find out exactly where it is stuck. I presume additional data or an environment change may be causing the issue.