Member since: 01-27-2023
Posts: 229
Kudos Received: 73
Solutions: 45
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 622 | 02-23-2024 01:14 AM |
| | 783 | 01-26-2024 01:31 AM |
| | 560 | 11-22-2023 12:28 AM |
| | 1253 | 11-22-2023 12:10 AM |
| | 1450 | 11-06-2023 12:44 AM |
10-17-2023
04:21 AM
@MWM, a value that appears as a STRING is not the same as a value that actually is a STRING 🙂 Go into your database and check what data type your column has. Besides that, download the file generated by QueryDatabaseTable, open it with an Avro reader and see what Avro schema has been generated and what Avro type has been assigned to that specific column. I am pretty certain that you are not working with strings here, but you will get your confirmation once you check the points above.
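If it helps, this is roughly how you could peek at the generated schema once you have the Avro file locally. It is only a sketch assuming Python with the fastavro package, and "output.avro" is a placeholder for the file you downloaded from your flow:

# Sketch: inspect the Avro schema produced by QueryDatabaseTable
# "output.avro" is a placeholder for the downloaded file
from fastavro import reader

with open("output.avro", "rb") as fo:
    avro_reader = reader(fo)
    # writer_schema shows the Avro type assigned to each column
    print(avro_reader.writer_schema)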
10-17-2023
12:39 AM
As you know, EOFException is an exception in Java that occurs when the end of a file or stream is reached unexpectedly during input. In your case, if you receive this error, most likely something is wrong with how you are sending the API call and you need to have a closer look at it. In addition, check the nifi-app.log file to see the entire stack trace, as the bulletin board might not show you the full error message 🙂 Besides that, are you certain that you have open connectivity between your NiFi instance and the API endpoint you are trying to call?
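For the connectivity part, a quick sanity check run from the NiFi host can already tell you a lot. This is only a sketch; the host and port are placeholders for your actual API endpoint:

# Sketch: verify that the NiFi host can open a TCP connection to the API endpoint
import socket

host, port = "api.example.com", 443   # placeholders for your endpoint
try:
    with socket.create_connection((host, port), timeout=5):
        print("TCP connection to %s:%s succeeded" % (host, port))
except OSError as exc:
    print("Cannot reach %s:%s -> %s" % (host, port, exc))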
10-16-2023
06:41 AM
@MWM, How did you define the schema you are using to fetch the AVRO Data and how did you define the schema for writing AVRO to JSON? What column type is the UUID in your database?
10-16-2023
06:39 AM
1 Kudo
@AhmedParvez, if you need assistance with a problem, please make sure that you include everything in your post. Right now, your post does not contain any information about what you are trying to do 🙂 What is the input of that InvokeHTTP processor, what are you trying to call, and how? What answer do you expect? Based on how you described your problem, the only possible answer right now is: you have a problem in your processor; take a look at what you are doing and correct your mistake, as this will solve your problem.
10-11-2023
07:29 AM
@Mercy, your MergeRecord processor does not perform any action because you set the minimum number of records to a very large value. Keep in mind that the property Minimum Number of Records is a hard limit, whereas Maximum Number of Records is a soft limit. This basically means that your processor will wait until the incoming queue has at least X records present, where X is the value you set as the minimum. If you want to merge those files without waiting until you reach that value, you have two options:
1) Decrease the value set in Minimum Number of Records.
2) Set a Max Bin Age. This property works as a timer: if you set it to 120 minutes or 75 seconds, the processor will wait at most that amount of time and then merge the records that are present, ignoring the value set for Minimum Number of Records.
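Just to illustrate the second option, a configuration along these lines (the values are placeholders, adjust them to your own volumes) would merge whatever has queued up after at most five minutes, even if the minimum has not been reached:

Minimum Number of Records = 10000
Maximum Number of Records = 50000
Max Bin Age = 5 min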
10-11-2023
06:44 AM
1 Kudo
Again, your post is a little bit vague and does not provide much information, so the answer will be quite generic as well. You can use any processor you want to extract the data from your source database: GenerateTableFetch+ExecuteSQLRecord, ExecuteSQLRecord, ExecuteSQL, QueryDatabaseTable, QueryDatabaseTableRecord. In all of them you will have to define a DBCP Connection Service so that you can connect to your database. In the Record-based processors, you can also define the format of the output (Avro, Parquet, JSON, etc.). Afterwards, you do whatever processing you need and, assuming that you will use a REST API, you can use the InvokeHTTP processor to call that API endpoint with whatever parameters you require. Take note that if those parameters are inside the flow file, you will need to extract them as attributes, meaning that you will have to add some extra processing, for example along the lines of the sketch below.
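Purely as an illustration of such a flow (the processor choice, attribute name and POST method are assumptions, not a prescription), it could look like:

QueryDatabaseTableRecord (DBCPConnectionPool + JsonRecordSetWriter)
    -> EvaluateJsonPath (extract the fields you need as flow file attributes)
    -> InvokeHTTP (HTTP Method = POST, reference the extracted attributes in the URL or headers via Expression Language, e.g. ${my.attribute})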
10-10-2023
06:46 AM
@Abhiram-4455, can you please be more specific in your request? What are you trying to do exactly? What do you mean by "move"? What do you mean by "SQL DB"? There are plenty of database types. And so on.
10-09-2023
06:22 AM
@s300570, If I understood your requirement correctly, you might try something like:

WITH WorkingDaysCTE AS (
    SELECT
        ref_date,
        ROW_NUMBER() OVER (ORDER BY ref_date) AS row_num
    FROM
        cd_estruturais.calendario_datas
    WHERE
        ref_date >= CURRENT_DATE -- You can adjust the starting date as needed
        AND civil_util = 1
)
SELECT
    c1.*,
    w1.ref_date AS next_wkday,
    DATEDIFF(w1.ref_date, c1.ref_date) AS Num_Days
FROM
    cd_estruturais.calendario_datas c1
JOIN
    WorkingDaysCTE w1 ON c1.ref_date <= w1.ref_date
    AND w1.row_num = (
        SELECT MIN(w2.row_num)
        FROM WorkingDaysCTE w2
        WHERE w2.ref_date > c1.ref_date
    )
WHERE
    c1.ref_date >= CURRENT_DATE
    AND c1.civil_util = 1
LIMIT 1;

Now, just as info, as I no longer have access to an Impala/Hive system, the above query was written as standard SQL, so you might want to double-check the syntax. In terms of explanation: the common table expression (CTE) named WorkingDaysCTE filters the dates with civil_util = 1 and assigns a row number to each date based on its order. The main query then joins the calendario_datas table with the CTE, and the correlated subquery in the join condition picks the smallest row number whose date lies after the date in the calendar table; this is what gives you the next working day. The DATEDIFF function is then used to calculate the number of days between ref_date and next_wkday.
10-09-2023
12:36 AM
1 Kudo
Well, the only advice I can give you is to write your script, see what errors you get and come back with them. Nobody can write your processor for you when only you know your requirements. What I can suggest, though, is to have a look at the following examples, as they might assist you with what you are trying to achieve:
https://community.cloudera.com/t5/Community-Articles/ExecuteScript-Cookbook-part-1/ta-p/248922
https://community.cloudera.com/t5/Community-Articles/ExecuteScript-Cookbook-part-2/ta-p/249018
https://community.cloudera.com/t5/Community-Articles/ExecuteScript-Cookbook-part-3/ta-p/249148
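If a starting point helps, the cookbook pattern for a Jython ExecuteScript boils down to something like this (session and REL_SUCCESS are variables bound by the processor itself, and the attribute name is just an example):

# Minimal ExecuteScript (Jython) skeleton
flowFile = session.get()
if flowFile is not None:
    # add or modify attributes, or read/write the content here
    flowFile = session.putAttribute(flowFile, 'my.attribute', 'some-value')
    session.transfer(flowFile, REL_SUCCESS)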
10-06-2023
05:42 AM
@manishg, what exactly are you trying to achieve? Can you be a little bit more specific? NiFi has plenty of processors that will split your data, but it depends on what you are trying to split and how. For example, you could use a SplitRecord processor, which does exactly what you describe: it splits an input FlowFile into multiple smaller FlowFiles. All you have to do is configure a Record Reader and a Record Writer, so that NiFi knows how to read and how to write the data. Afterwards, you set how many records you want in each new FlowFile ... and that is all (a small configuration sketch follows below). You can also use SplitAvro, SplitJson, SplitContent, SplitText and even SplitXml, depending on what you are actually trying to achieve with your entire flow. Another option would be to use a scripting processor like ExecuteStreamCommand or ExecuteScript, in which you define and configure a script that performs the logic you are looking for. And there are many other options you could try. So, what I am trying to say is that if you want a specific answer to your question, you need to say exactly what you are trying to do. Otherwise, the above lines should be sufficient to understand that what you are trying to achieve is possible in NiFi 🙂 PS: don't try to split a CSV file with millions of lines into individual flow files containing one line each, because you will regret doing that 🙂
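As a purely illustrative example of a SplitRecord configuration (the reader, writer and record count are placeholders for whatever matches your data):

Record Reader = CSVReader
Record Writer = JSONRecordSetWriter
Records Per Split = 1000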