Member since: 01-27-2023
Posts: 215
Kudos Received: 61
Solutions: 42
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 67 | 11-22-2023 12:10 AM |
 | 172 | 11-06-2023 12:44 AM |
 | 272 | 11-02-2023 02:01 AM |
 | 359 | 10-18-2023 11:37 PM |
 | 324 | 10-09-2023 12:36 AM |
10-18-2023
11:37 PM
1 Kudo
@Hae, if you are going to use ExecuteSQL to execute the INSERT statement based on the content of the file, from my point of view you should first extract the value of your content into an attribute on your FlowFile. To do that, you add an ExtractText processor, where you define a new property named "what_you_want_to_have_it_called" and assign it the value ".*". This will extract the content of your FlowFile and store it in the attribute you defined as a property. NOTE: if you are going to have lots of information stored in the content of the FlowFile, you will encounter many issues with this approach. It works properly only with small FlowFiles, exactly like you described in your post 🙂 Next, using ExecuteSQL, you execute the insert like: INSERT INTO your_table VALUES ('${what_you_want_to_have_it_called}').

Another option, which does not involve all these steps, would be to use a PutDatabaseRecord processor, in which you define directly the action you are trying to perform and it will handle everything for you. NOTE: this works fine for larger FlowFiles as well, since you no longer need to extract the content as attributes and NiFi handles everything on its own. The only downside is that you will have to configure a Record Reader.
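For illustration only, here is a minimal Python sketch of what that first flow does conceptually: read a small file's content and insert it as one value. The sqlite3 database and the table/column names are hypothetical stand-ins for your target database.

```python
import sqlite3

# Conceptual equivalent of ExtractText (content -> attribute)
# followed by ExecuteSQL with ${what_you_want_to_have_it_called}.
# File, database, table, and column names are hypothetical.
with open("small_flowfile.txt", "r", encoding="utf-8") as f:
    content = f.read()  # fine for small files only, as noted above

conn = sqlite3.connect("example.db")
conn.execute("CREATE TABLE IF NOT EXISTS my_table (payload TEXT)")
# Parameterized on purpose: NiFi's ${...} is plain text substitution,
# so there you must handle quoting/escaping yourself.
conn.execute("INSERT INTO my_table (payload) VALUES (?)", (content,))
conn.commit()
conn.close()
```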
10-17-2023
07:23 AM
@MWM, when using toString(), the Replacement Value Strategy should stay on "Record Path Value".
10-17-2023
07:07 AM
The value for the property should be toString(/YOUR_COLUMN, "UTF-8"), without those backslashes. Have a look at the documentation for the RecordPath functions and you will see. As for how the data will look, that is not something you can change. Give it a try with the correct syntax, not the one that you wrote.

Maybe the data in your bytes column is stored encrypted or in another format. You should discuss this further with the owner of the view and understand how the view is built and how the data is stored in the column. Without that information, it is hard to establish the right way to extract the data correctly.

Another solution would be to use a Python script and execute it on the content of the AVRO file to decode the bytes column into a string, then send the output forward into processing. (This is not easy to implement, especially if you do not have sudo on the NiFi machines.)
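Since a Python script is mentioned, here is a minimal sketch of that idea, assuming the fastavro package is installed and the bytes column really holds UTF-8 text; the file and column names are placeholders.

```python
import json
from fastavro import reader

# Placeholders: adjust the file name and column name to your flow.
with open("flowfile_content.avro", "rb") as f:
    for record in reader(f):
        value = record["YOUR_COLUMN"]
        # Only valid if the bytes are UTF-8 text; encrypted or otherwise
        # encoded values will raise or produce garbage here.
        if isinstance(value, bytes):
            record["YOUR_COLUMN"] = value.decode("utf-8")
        print(json.dumps(record))
```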
10-17-2023
06:23 AM
1 Kudo
Open the Controller Service for AvroRecordSetWriter and, in the Schema Access Strategy field, switch from "Inherit Record Schema" to "Use 'Schema Text' Property". Once you select this option, a new property named Schema Text will be added. In the value field for this property, add the AVRO schema.
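As an illustration, this is the kind of schema text you would paste into that property (record and field names are made up); parsing it with fastavro first is an optional way to catch typos, assuming fastavro is installed.

```python
import json
from fastavro import parse_schema

# Hypothetical schema text; record and field names are placeholders.
schema_text = """
{
  "type": "record",
  "name": "my_record",
  "fields": [
    {"name": "UUID_1", "type": "string"},
    {"name": "OTHER_COLUMN", "type": ["null", "string"], "default": null}
  ]
}
"""
parse_schema(json.loads(schema_text))  # raises on a malformed schema
print("schema text is valid")
```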
10-17-2023
05:39 AM
Well, there you go: UUID is bytes and not string. That is the reason why your data gets displayed like that when transformed into JSON. You need to convert the bytes into CHAR when extracting the data from the view, if you need the data as a string.

What you could try is adding an UpdateRecord processor, in which you define an AVRO Reader and an AVRO Writer. In the AVRO Writer you set the schema you mentioned above, except that instead of bytes for UUID_1 you write string. Next, in the processor, you add a new property with the same name as your affected column, starting with "/". For the value, you use the RecordPath toString function to transform the bytes into a string: toString(/YOUR_COLUMN_IN_BYTES_FORMAT, "UTF-8").

If that works, you can modify the flow to write JSON directly instead of AVRO and skip the ConvertAvroToJson step.
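Roughly, per record, that toString transform amounts to the following (field name hypothetical):

```python
# Rough per-record analogue of toString(/UUID_1, "UTF-8") in UpdateRecord,
# with the writer schema declaring UUID_1 as "string" instead of "bytes".
def uuid_bytes_to_string(record: dict, field: str = "UUID_1") -> dict:
    value = record[field]
    if isinstance(value, bytes):
        record[field] = value.decode("utf-8")
    return record

print(uuid_bytes_to_string({"UUID_1": b"3f2b0a..."}))
```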
10-17-2023
04:57 AM
To see the content of the file, you can use NiFi. However, to see the generated schema, you will need an IDE like IntelliJ with the AVRO/Parquet plugin, or you can search for an online AVRO reader and upload your data there.
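If you happen to have Python with fastavro at hand, a few lines also do the trick; the writer schema is embedded in the AVRO file header (file name is a placeholder):

```python
import json
from fastavro import reader

# Placeholder file name; use the file written by your flow.
with open("output.avro", "rb") as f:
    avro_reader = reader(f)
    # Every AVRO file embeds the schema it was written with.
    print(json.dumps(avro_reader.writer_schema, indent=2))
```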
10-17-2023
04:21 AM
@MWM, "appears as a STRING" is not the same as "it is a STRING" 🙂 Go into your database and check what data type your column has. Besides that, download the file generated by QueryDatabaseTable, open it with an AVRO reader and see what AVRO schema has been generated and what AVRO type has been assigned to that specific column. I am pretty certain that you are not working with strings here, but you will get your confirmation once you check the above.
10-17-2023
12:39 AM
As you know, EOFException is an exception in Java that occurs when an end of file or end of stream is reached unexpectedly during input. In your case, if you receive this error, most likely something is wrong with how you are sending the API call, and you need to have a look at it. In addition, check the nifi-app.log file to see the entire stack trace of the error message, as the bulletin board might not show you the full message 🙂 Besides that, are you certain that you have open connectivity between your NiFi instance and the API endpoint you are trying to call?
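To answer the connectivity question quickly, here is a small check you could run from the NiFi host (host and port are placeholders):

```python
import socket

# Placeholders: use the real API host and port (443 for HTTPS).
host, port = "api.example.com", 443
try:
    with socket.create_connection((host, port), timeout=5):
        print(f"TCP connection to {host}:{port} succeeded")
except OSError as exc:
    print(f"cannot reach {host}:{port}: {exc}")
```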
10-16-2023
06:41 AM
@MWM, how did you define the schema you are using to fetch the AVRO data, and how did you define the schema for writing AVRO to JSON? What column type does the UUID have in your database?
10-16-2023
06:39 AM
1 Kudo
@AhmedParvez, if you need assistance with a problem, please make sure that you write everything in the post. Right now, your post does not contain any information regarding what you are trying to do 🙂 What is the input of that InvokeHTTP processor, what are you trying to call, and how? What answer do you expect? As it stands, the only possible answer to your post is: you have a problem in your processor; take a look at what you are doing and correct your mistake, as this will solve your problem.