Member since: 08-19-2024
Posts: 30
Kudos Received: 8
Solutions: 1
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 2293 | 08-22-2024 12:41 AM |
09-03-2024
11:09 PM
2 Kudos
Thanks, it was the ReplaceText processor, and this regex really helped.
09-03-2024
12:09 AM
1 Kudo
Hello @SAMSAL, I have used the ExtractText processor. This processor has a built-in property "Search Value", which I filled in as ^(.{5})(.{10}), and the property "Replacement Value", which I filled in as $1,$2. My processor configuration is below.

I also want to keep the whitespace in my address. For example, if the line is smithAb Cd 12345678, then I want the user name to be smith and the address to be Ab Cd 1234. I should also point out that I am essentially constructing a comma-separated flowfile this way; the comma in $1,$2 is what makes the fields comma-separated in the end.

The issue is that everything works fine except that $2 is not limited to only 10 characters: it also takes the characters after the 10th, so the address ultimately becomes Ab Cd 12345678 instead of Ab Cd 1234.

As for the experimentation I mentioned: I observe that if I set "Search Value" to ^(.{5})(.{10})(.{1}) and "Replacement Value" to $1,$2,$3, then both the username and the address come out as expected, and the $3 replaced value contains the extra trailing characters up to the end of the line.
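A plausible explanation (an assumption, since the thread does not state it outright): a per-line search-and-replace substitutes only the matched portion of the line and leaves the unmatched tail in place, so the trailing characters survive unless the pattern consumes the whole line. A minimal Python sketch of the same regex mechanics:

```python
import re

line = "smithAb Cd 12345678"

# Replacing only the matched 15-char prefix leaves the tail ("5678")
# in place, mirroring what a per-line search-and-replace does.
partial = re.sub(r"^(.{5})(.{10})", r"\1,\2", line)
print(partial)  # smith,Ab Cd 12345678

# Appending .*$ makes the pattern consume the whole line, so the
# trailing characters are dropped from the result.
full = re.sub(r"^(.{5})(.{10}).*$", r"\1,\2", line)
print(full)     # smith,Ab Cd 1234
```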
09-02-2024
03:19 AM
1 Kudo
Hello, I have a simple use case. I have an incoming file where each line contains, say, a user name and a user address. The user name spans characters 1 to 5 and the user address spans characters 6 to 15. How do I use the ExtractText processor for this?

I tried using a Search Value of ^(.{5})(.{10}) and a Replacement Value of $1,$2. The issue I am having is that for the address it captures all characters from the 6th to the end of the line, not just up to the 15th. What should I modify?

Just as an experiment I tried ^(.{5})(.{10})(.{1}) with $1,$2,$3, and that is able to capture characters 6 to 15 properly. Please help.
Labels: Apache NiFi
08-28-2024
08:42 AM
I have a simple file with, say, 1000 lines, each containing a username and a user address. I need to insert them into a database, in order. I am able to use PutFile and PutDatabaseRecord for the insertion, but they insert in arbitrary order. I want the rows inserted in the same order they appear in the file. How can I do that?
Labels: Apache NiFi
08-28-2024
08:39 AM
Say I have a file with 100 records. I want to ignore the first line, as it contains data I do not want to process (some info like the file name, date, and number of records), and process from the second line onward. How can I do that?
Labels: Apache NiFi
08-28-2024
08:32 AM
Hello, one Stack Overflow answer suggested configuring the property "Record Reader" -> "CSVReader", and on the controller service (CSVReader) setting "Schema Access Strategy" to "Use String Fields From Header". This helped.

You mentioned an Avro schema. Could you please give some insights on how to configure that?

I also have a question on a different topic. Say I have a file with 100 records. I want to ignore the first line, as it contains data I do not want to process, and process from the second line onward. How can I do that? Thanks
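As a rough analogue of that schema strategy outside NiFi (a sketch, assuming comma-delimited input and hypothetical column names): the field names are taken from the header row and every value is kept as a plain string, which is what Python's csv.DictReader does:

```python
import csv
import io

# Hypothetical two-column sample; the header row supplies the field names.
data = "user_name,user_address\njohn1,Ab Cd 1234\n"

# Analogous to "Use String Fields From Header": names come from the
# first line (which is consumed, not emitted as a record), and all
# values are read as strings.
reader = csv.DictReader(io.StringIO(data))
rows = list(reader)

print(rows[0]["user_name"])     # john1
print(rows[0]["user_address"])  # Ab Cd 1234
```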
08-23-2024
07:06 AM
1 Kudo
In NiFi I am using PutDatabaseRecord. When I insert data into the DB, why is the leading 0 being omitted? For example, if the value is 0123, only 123 is inserted into the table. Note: if I use PutSQL, the leading 0 is not omitted. How do I configure PutDatabaseRecord so that it does not omit leading zeros?
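A likely cause (an assumption, not confirmed in the thread) is that the record schema infers the field as numeric, and numeric parsing drops leading zeros; keeping the field as text (e.g. declaring it "type": "string" in an Avro schema) would preserve the value verbatim. A minimal sketch of the coercion itself:

```python
raw = "0123"

# Coerced to a number, the leading zero is lost and cannot be
# recovered when the value is written back out.
as_number = int(raw)
print(as_number)       # 123
print(str(as_number))  # 123

# Kept as text, the value survives verbatim.
print(raw)             # 0123
```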
Labels: Apache NiFi
08-22-2024
12:42 AM
1 Kudo
Thanks @VidyaSargur. I have also posted a reply (setting the property "Support Fragmented Transactions" to false in the PutSQL processor) which seems to be the solution.
08-22-2024
12:41 AM
Hello, I had asked this question, and now I think I have a solution. In the PutSQL processor there is a property "Support Fragmented Transactions". If I set its value to false, the flow proceeds and the flowfiles are no longer penalised. (Reading further into it, the documentation mentions fragment.identifier and fragment.count, because of which the processor waits for all flowfiles; it also notes this is for respecting atomicity.) Can anyone advise whether setting this property to false is proper or not? Right now, with it set to false, the flowfiles are not being penalised.
08-19-2024
04:37 AM
1 Kudo
I have a requirement to read, say, user names from a file (one per line) and insert them into a database (Oracle). I am using GetFile->SplitText->ExtractText->PutSQL in NiFi. All works fine for a small number of records (around 10). But when I try 50 or 100+ records, on the connection between ExtractText and PutSQL I get "A flowfile is currently penalized and data can not be processed at this time", and all flowfiles remain in the queue. Note: in PutSQL I am using a simple statement, e.g. insert into users_table(user_name) values ('user123'). It works properly for 10 or 20 records, but when I provide 100+ records it gets stuck.
Labels: Apache NiFi