
NiFi: FlowFile as input/output to PySpark scripts in ExecuteStreamCommand processor

I have been using the ExecuteStreamCommand processor to run Python scripts, successfully passing the FlowFile content to each script via sys.stdin and writing the output FlowFile content via sys.stdout. Now all of these Python scripts need to be converted to PySpark. How can I use the FlowFile as input/output in the ExecuteStreamCommand processor? sys.stdin and sys.stdout do not seem to work with PySpark. Is there another way?
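For reference, the plain-Python pattern described above looks roughly like this (the transformation is a hypothetical placeholder): ExecuteStreamCommand pipes the incoming FlowFile content to the script's stdin, and whatever the script writes to stdout becomes the outgoing FlowFile content.

```python
#!/usr/bin/env python
# Minimal sketch of the stdin/stdout contract ExecuteStreamCommand uses:
# the incoming FlowFile content arrives on stdin, and whatever the script
# writes to stdout becomes the content of the outgoing FlowFile.
import sys

def transform(text):
    # Hypothetical placeholder transformation; replace with real logic.
    return text.upper()

if __name__ == "__main__":
    sys.stdout.write(transform(sys.stdin.read()))
```

With PySpark launched via spark-submit, the driver process can in principle still read sys.stdin the same way and then hand the data to Spark (e.g. with spark.sparkContext.parallelize), but anything printed by Spark itself goes to stdout/stderr too and can corrupt the FlowFile content, which may be why the pattern appears not to work. A common workaround is to have NiFi write the FlowFile to a file (e.g. with PutFile) and pass the path as a command argument instead.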