Support Questions

Find answers, ask questions, and share your expertise

Read the flow file of ExecuteStreamCommand

Rising Star

I am executing a Python script in ExecuteStreamCommand, and the last line of my code dumps JSON key/value pairs.
My goal is to read those key values in a PutDatabaseRecord processor and insert the values into my database. However, I am unable to read the content of the flow file coming out of ExecuteStreamCommand. I have used ConvertAvroToJSON, but it reports "not a avro data file". Is there any workaround for this? Thanks.
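For context, a minimal sketch (with made-up values; the keys mirror the displayName/mail fields discussed later in this thread) of how such a script's last line can emit JSON. ExecuteStreamCommand routes the command's stdout into the outgoing flow file content, so what lands in the flow file is plain JSON text, not an Avro container — which is why ConvertAvroToJSON rejects it:

```python
import json
import sys

# Hypothetical values standing in for whatever the real script computes.
result = {"displayName": "Jane Doe", "mail": "jane.doe@example.com"}

# ExecuteStreamCommand captures stdout as the outgoing flow file content,
# so printing plain JSON here is enough -- no Avro encoding involved.
sys.stdout.write(json.dumps(result))
```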

1 ACCEPTED SOLUTION


If you are able to store the displayName and mail values in flowfile attributes, then the easiest way is to use the PutSQL processor and build your insert statement from those values and any other values you want to ingest into the target table. In PutSQL you can specify the insert statement in the SQL Statement property as follows:

 

insert into myTable(displayNameCol, mailCol, otherCol...) values
('${displayNameAttribute}','${mailAttribute}', 'otherValue',...)

 

AttributesToJSON should work as well: specify which attributes you want to insert into the DB so they are converted into a JSON flow file, then use PutDatabaseRecord with a JsonTreeReader and set the Statement Type property to INSERT. The only caveat is that in PutDatabaseRecord the attribute names must match the table column names.

If that helps please accept solution.

Thanks
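To illustrate that caveat, here is a sketch (column and attribute names are hypothetical) of the JSON that AttributesToJSON would need to produce so that the JsonTreeReader fields line up with the target table's columns:

```python
import json

# Hypothetical target columns in myTable.
table_columns = {"displayNameCol", "mailCol"}

# What the AttributesToJSON output would have to look like: the JSON
# keys (i.e. the attribute names) must match the column names exactly
# for PutDatabaseRecord's INSERT to map them.
flowfile_content = '{"displayNameCol": "Jane Doe", "mailCol": "jane.doe@example.com"}'

record = json.loads(flowfile_content)
assert set(record) == table_columns  # keys and columns line up
```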


6 REPLIES


Hi @Kiranq ,

 

If you are getting the output from ExecuteStreamCommand as JSON, I don't understand why you are using the ConvertAvroToJSON processor. In PutDatabaseRecord you can set the Record Reader property to JsonTreeReader and consume the JSON as such. If you can shed more light on what the ExecuteStreamCommand output looks like, maybe we can help you better.

 

If that helps please accept solution.

Thanks

Rising Star

Thanks @SAMSAL for responding. I will explain what I am trying to accomplish.
The ExecuteStreamCommand gives me the result below:

(screenshot: Kiranq_0-1693340857298.png)


and I want to use the displayName and mail keys' values from this result and store them in the database. My current workflow stores these two keys' values as attributes and uses them in the subsequent processors. However, AttributesToJSON isn't working for me either.

Can you suggest any other, better approach to ingest these two values into my database? I have other columns' values to ingest as well, so maybe an ExecuteSQL processor would be a better choice to use.


Rising Star

Yes, I ended up changing my ExecuteScript so that all columns' values are now attributes of my flow file. Next, I added a PutSQL processor and insert the data using ${attributename}.

Thanks @SAMSAL for responding. 

Rising Star

@SAMSAL, hey, I know I accepted the solution, and it did work when I had one user's details to fetch and store in the processor's attributes.

I wanted to ask if it is possible to store multiple values in a single attribute, since I now have many users' details to store. Is this possible, or would I have to change the logic? Thanks.


If you are getting multiple records in one JSON array, then you probably need to use the SplitJson processor to get each record individually, then extract the values you need from each record using EvaluateJsonPath, and then do the PutSQL.
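A rough Python illustration of that chain (the sample data and column names are made up): the loop plays the role of SplitJson emitting one flow file per array element, and the key lookups play the role of EvaluateJsonPath pulling $.displayName and $.mail into attributes for the PutSQL statement:

```python
import json

# Hypothetical ExecuteStreamCommand output: one array, one object per user.
payload = (
    '[{"displayName": "Jane Doe", "mail": "jane.doe@example.com"},'
    ' {"displayName": "John Roe", "mail": "john.roe@example.com"}]'
)

statements = []
for user in json.loads(payload):        # SplitJson: one flow file per element
    display_name = user["displayName"]  # EvaluateJsonPath: $.displayName
    mail = user["mail"]                 # EvaluateJsonPath: $.mail
    statements.append(
        f"insert into myTable(displayNameCol, mailCol) "
        f"values ('{display_name}', '{mail}')"
    )
```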