Created 10-20-2019 07:26 AM
Hi,
Just started exploring NiFi to automate some of our data analytics flows. I am executing a spark-submit command from a shell script using the ExecuteStreamCommand processor, and I need to read the script's output to check whether the job succeeded or failed. I think the output is being redirected to STDERR, so I only see a truncated copy in the 'execution.error' attribute; the truncation happens because Spark output can be huge. The ExecuteProcess processor has a 'Redirect Error Stream' property that puts all the output into the flowfile, which works for me, but I cannot connect upstream connections to ExecuteProcess. I am planning either to send the output to a file from the shell script and read the status from there, or to extract the Spark application id from a NiFi attribute and pull the logs from YARN. Checking if there is a better solution which I may be missing.
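One workaround I have seen sketched for this kind of setup (not NiFi-specific, just plain shell): wrap the command in a small script that folds STDERR into STDOUT, so ExecuteStreamCommand captures the full log in the flowfile content instead of the truncated 'execution.error' attribute, and append an explicit exit-status line that a downstream RouteOnContent can match. The function name `run_and_report` and the status-line format are my own placeholders, not anything NiFi or Spark defines:

```shell
# Hypothetical wrapper: run the given command (e.g. spark-submit ...),
# merge STDERR into STDOUT so NiFi sees the whole log in the flowfile,
# and emit a final status line that downstream processors can route on.
run_and_report() {
  "$@" 2>&1                    # fold stderr into stdout
  local status=$?
  echo "EXIT_CODE=${status}"   # machine-readable status line for routing
  return "${status}"
}

# Example usage -- substitute your real spark-submit invocation:
#   run_and_report spark-submit --class com.example.Job job.jar
run_and_report echo "simulated spark output"
```

With this in place, a RouteOnContent processor matching `EXIT_CODE=0` on the flowfile content could separate success from failure without parsing the full Spark log.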
Thanks
Created 01-16-2020 12:31 AM
Hi,
I am facing the same issue. Did you get any solution for your problem?
Created 01-16-2020 05:52 AM
I was also facing a similar kind of issue, and this helped me!
Try this link:
https://github.com/mnemonic-no/act/blob/master/example-config/scio-act-workflow-2019-11-22.xml
It might help you!
Regards,
Lewis