Hello, I need to execute a Spark job from NiFi. The .jar file is in HDFS, and I need to know whether the job finished successfully or failed.
What is the best option?
Is it possible to launch a spark-submit command and receive the status?
Yes, you can run ephemeral Spark jobs from NiFi. Please refer to https://community.hortonworks.com/repos/64179/launching-spark-jobs-from-nifi.html and https://github.com/diegobaez/PUBLIC/tree/master/NiFi-SnapSpark
Note: Please upvote or accept this answer if you found it useful.
@PankajKadam The Spark job I need to launch is independent of the flow. I mean, I need to launch a spark-submit command and receive its status without an incoming flowfile.
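One way to do this (a sketch, not a definitive setup) is to wrap spark-submit in a small shell script and run it from NiFi's ExecuteProcess processor, which needs no incoming flowfile. When submitting to YARN, spark-submit generally exits non-zero if the application fails, so the exit code and a printed status line are enough to signal the outcome. The class name, jar path, and master below are placeholders, not values from this thread:

```shell
#!/bin/sh
# report_status runs whatever command it is given, then prints SUCCESS or
# FAILURE based on the command's exit code and returns that same code.
report_status() {
  "$@"
  status=$?
  if [ "$status" -eq 0 ]; then
    echo "SUCCESS"
  else
    echo "FAILURE (exit code $status)"
  fi
  return "$status"
}

# Hypothetical submission: adjust --master, --class, and the HDFS jar path
# to your cluster. The trailing "|| true" keeps the wrapper script from
# aborting, since the failure has already been reported on stdout.
report_status spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.MyJob \
  hdfs:///apps/jars/my-job.jar || true
```

In NiFi, point ExecuteProcess at this script; the SUCCESS/FAILURE line arrives as the flowfile content, which you can route with RouteOnContent.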