
How to pass a huge JSON output (flow file) to ExecuteStreamCommand?

Hi,

I am calling a Python script and passing a huge JSON as input. Even though I have increased the attribute length limit in ExecuteStreamCommand, my JSON input is getting truncated. How can I overcome this?

Thank you in advance,

Subash

3 REPLIES

Super Guru

Is your JSON input coming in as the content of the incoming flow file or as an attribute? If the latter, it may have already been truncated (you can check using List Queue on the connection before the ExecuteStreamCommand processor). Also, how big is your input JSON, what is the expected output, and what did you set the outgoing attribute length to?

Basically, my JSON file contains schema info. For example, if a table has 300 columns, the JSON input has the attributes of each column, such as column name, data type, etc.

Yes, the JSON input is coming in as the content of the flow file. From the content I am extracting the JSON into an attribute, and I have changed the attribute length to 10k, but even then the JSON input is getting truncated.
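A minimal sketch of the alternative that avoids the attribute length limit altogether: ExecuteStreamCommand streams the incoming flow file content to the command's STDIN, so the script can read the full JSON from stdin instead of receiving it as a (truncatable) attribute argument. The "columns", "name", and "datatype" keys below are hypothetical placeholders for the schema structure described above.

```python
import json
import sys

# ExecuteStreamCommand streams the incoming flow file content to STDIN,
# so the full JSON is available regardless of any attribute length limit.
schema = json.load(sys.stdin)

# Hypothetical structure: a top-level "columns" array with per-column entries.
for column in schema.get("columns", []):
    print(column.get("name"), column.get("datatype"))
```

Whatever the script writes to stdout then becomes the content of the outgoing flow file.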

NiFi is cool. In my Python script I was passing the JSON as input, doing a transformation on it, and then invoking an HTTP POST request. I resolved my JSON issue by using NiFi's built-in processors (EvaluateJsonPath --> SplitJson --> InvokeHTTP). 100 lines of code were reduced to just 3 processors 🙂
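For reference, a rough sketch of the kind of script that processor chain replaces: split the schema JSON into one record per column and POST each record. The "columns" array and the endpoint URL below are hypothetical placeholders, not part of the original flow.

```python
import json
import sys
import urllib.request

# Parse the schema JSON from stdin (the part EvaluateJsonPath/SplitJson now handle in NiFi).
schema = json.load(sys.stdin)

# Placeholder endpoint; in the NiFi flow this is the InvokeHTTP Remote URL.
ENDPOINT = "http://example.com/api/columns"

# Mirror SplitJson --> InvokeHTTP: one POST per column record.
for column in schema.get("columns", []):
    body = json.dumps(column).encode("utf-8")
    request = urllib.request.Request(
        ENDPOINT,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        print(response.status)
```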