
Spark job gets stuck when submitted from NiFi ExecuteStreamCommand

I use ExecuteStreamCommand to submit a Spark job with the "spark-submit" command, scheduled to run every minute.
I noticed strange behavior from the processor: when I opened the Spark UI, I found that the job gets stuck at a certain task and stays there with no progress. The processor gets stuck as well.

When I submit the job using the same command from the terminal (command line), outside NiFi, it executes and terminates successfully.
I don't know the reason for this strange behavior.

When I run the same command from the ExecuteProcess processor, it completes successfully. But I need to submit the job using ExecuteStreamCommand so that I can detect job failures.
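For reference, a minimal sketch of an ExecuteStreamCommand configuration for this kind of setup; the command path, class, and jar below are placeholders, not the poster's actual values (ExecuteStreamCommand splits Command Arguments on the Argument Delimiter, which defaults to ";"):

    Command Path:       /usr/bin/spark-submit
    Command Arguments:  --master;yarn;--deploy-mode;cluster;--class;com.example.MyJob;/opt/jobs/my-job.jar
    Argument Delimiter: ;
    Ignore STDIN:       false    (the default; see the second reply below)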

[Screenshot attached: 38514-screenshot-from-2017-09-05-14-34-41.png]

2 REPLIES

New Contributor

@Mahmoud Yusuf, are you calling spark-submit directly from ExecuteStreamCommand, or is it wrapped in a script?
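If it is wrapped, a minimal sketch of such a wrapper, assuming a YARN cluster job; the master, class, and jar are placeholders. Redirecting STDIN from /dev/null is one common way to keep a child process from blocking while waiting on input, and exiting with spark-submit's own status lets the calling processor see failures:

    #!/bin/bash
    # Hypothetical wrapper around spark-submit; adjust master, class, and jar.
    # Redirect STDIN from /dev/null so the child process cannot block on input.
    spark-submit \
        --master yarn \
        --deploy-mode cluster \
        --class com.example.MyJob \
        /opt/jobs/my-job.jar "$@" < /dev/null
    # Exit with spark-submit's status so the calling processor detects failures.
    exit $?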

Super Collaborator

Hi @Mahmoud Yusuf,

Can you please check the "Ignore STDIN" property? If it is set to true, the processor will not pass the incoming FlowFile content to the command's STDIN, and the command can proceed on its own.

Also, in the Spark UI you can see which action it was executing; by clicking that link you can find out where it got stuck. I presume additional data or an environment change may be causing the issue.
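A minimal sketch of the suggested setting, plus a hypothetical downstream rule for the failure detection mentioned in the question (ExecuteStreamCommand writes the command's exit code to the execution.status attribute of the output FlowFile):

    Ignore STDIN: true    (do not pass FlowFile content to the command's STDIN)

    RouteOnAttribute dynamic property "failed" (hypothetical rule name),
    matching any non-zero exit code:
    ${execution.status:toNumber():gt(0)}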

Hope this helps!