How to pass a huge JSON output (FlowFile) to ExecuteStreamCommand?

New Contributor

Hi,

I am calling a Python script and passing a huge JSON as input. Even though I have increased the attribute length limit in ExecuteStreamCommand, my JSON input is getting truncated. How do I overcome this?

Thank you in advance,

Subash

3 REPLIES

Re: How to pass a huge JSON output (FlowFile) to ExecuteStreamCommand?

Is your JSON input coming in as the content of the incoming flow file or as an attribute? If the latter, it may have already been truncated (you can check with List Queue before starting the ExecuteStreamCommand processor). Also, how big is your input JSON, what is the expected output, and what did you set the outgoing attribute length to?

Re: How to pass a huge JSON output (FlowFile) to ExecuteStreamCommand?

New Contributor

Basically, my JSON file contains schema info. For example, if a table has 300 columns, the JSON input has each column's attributes, such as column name, data type, etc.

Yes, the JSON input is coming in as the content of the flow file. From that JSON I am extracting values into an attribute, and I have changed the attribute length to 10k, but even then the JSON input is getting truncated.
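
For reference, ExecuteStreamCommand pipes the incoming flow file content to the command's STDIN, so the script can read the full JSON from there instead of relying on an extracted attribute (which is what the attribute length limit truncates). A minimal sketch, assuming the content is a single JSON document and that a top-level "columns" key exists (both are assumptions, not from the original post):

import json
import sys

# ExecuteStreamCommand writes the incoming flow file content to STDIN,
# so the full JSON is available here regardless of any attribute length limit.
schema = json.load(sys.stdin)

# Example transformation: pull out the column definitions.
# The "columns" key is an assumption about the schema JSON's shape.
columns = schema.get("columns", [])

# With no Output Destination Attribute configured, whatever is written to
# STDOUT becomes the content of the outgoing flow file.
json.dump({"columnCount": len(columns), "columns": columns}, sys.stdout)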

Re: How to pass a huge JSON output (FlowFile) to ExecuteStreamCommand?

New Contributor

NiFi is cool. In my Python script I was passing JSON as input and then doing a transformation on the JSON. After the transformation, I would invoke an HTTP POST request. I resolved my JSON issue by using NiFi's built-in processors (EvaluateJsonPath --> SplitJson --> InvokeHTTP). 100 lines of code got reduced to just 3 processors :)
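
For anyone landing here later, a rough sketch of the kind of Python logic those three processors replace: read the JSON, split out the array of elements, and POST each one. The "columns" key and the endpoint URL are placeholders, not details from the original post:

import json
import sys
import urllib.request

# Read the schema JSON from stdin (as ExecuteStreamCommand would supply it).
schema = json.load(sys.stdin)

# SplitJson equivalent: iterate over the array of column definitions.
# InvokeHTTP equivalent: POST each element to a (placeholder) endpoint.
for column in schema.get("columns", []):  # "columns" is an assumed key
    body = json.dumps(column).encode("utf-8")
    req = urllib.request.Request(
        "http://example.com/api/columns",  # placeholder URL
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req)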