Hi,
Just started exploring NiFi to automate some of our data analytics flows. I am executing a spark-submit command from a shell script using the ExecuteStreamCommand processor, and I need to read the shell script's output to check whether the job succeeded or failed. The output seems to be redirected to STDERR, so all I can see is a truncated version in the 'execution.error' attribute; it gets truncated because Spark output can be huge.

The ExecuteProcess processor has a 'Redirect Error Stream' property that puts all of the output into the flow file, which works for me, but I can't connect upstream connections to ExecuteProcess. I am planning to either write the output to a file from the shell script and read the status from there, or automate extracting the Spark application id from a NiFi attribute and fetch the logs from YARN. Checking if there is a better solution which I may be missing.
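For reference, this is roughly the wrapper approach I had in mind: a sketch only, with `run_and_report` as a hypothetical helper name and the log path just an example. The idea is to send the full spark-submit output (stdout and stderr) to a log file and print only a short status line on stdout, so ExecuteStreamCommand gets a small, predictable output instead of the huge Spark log.

```shell
# Hypothetical wrapper: run the given command (e.g. spark-submit ...),
# capture ALL of its output in a log file, and emit only a one-line
# status summary on stdout for NiFi to read.
run_and_report() {
  log="/tmp/job_$$.log"           # example log path; adjust as needed
  "$@" > "$log" 2>&1              # both stdout and stderr go to the log
  status=$?                       # exit code of the wrapped command
  if [ "$status" -eq 0 ]; then
    echo "SUCCESS log=$log"
  else
    echo "FAILURE exit=$status log=$log"
  fi
  return "$status"
}

# Usage would be something like:
#   run_and_report spark-submit --class MyJob my-job.jar
# Here 'true' stands in for the real command:
run_and_report true
```

The status line on stdout could then be checked with RouteOnContent (or the script's exit code with RouteOnAttribute via 'execution.status'), while the full log stays in the file for debugging.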
Thanks