Member since: 12-11-2017
Posts: 21
Kudos Received: 4
Solutions: 2

My Accepted Solutions

Title | Views | Posted
---|---|---
 | 4423 | 12-21-2020 02:44 AM
 | 1989 | 07-10-2020 08:42 AM
02-14-2021
12:23 AM
Hi @mburgess, can you please elaborate on which property needs to be configured in the GrokReader controller service to use the kv filter? I'm trying to parse incoming key=value pairs.

Input: key1=value1,key2=value2,key3=value3,key4=value4

Output: I need key1, key2, key3, and key4 as attribute names, with their respective values as attribute values.

I can use %{GREEDYDATA:msgbody} in the Grok Expression property, but I do not know where to provide kv { source = "msgbody" }. Your help is appreciated.
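For context, the transformation I'm after is just a comma-separated key=value split. A minimal Python sketch of that parsing, purely for illustration (this is not NiFi configuration):

```python
def parse_kv(msgbody):
    """Split "key1=value1,key2=value2" into a dict of name/value pairs."""
    pairs = (item.split("=", 1) for item in msgbody.split(",") if "=" in item)
    return {k.strip(): v.strip() for k, v in pairs}

print(parse_kv("key1=value1,key2=value2,key3=value3,key4=value4"))
# → {'key1': 'value1', 'key2': 'value2', 'key3': 'value3', 'key4': 'value4'}
```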
12-23-2020
01:39 PM
@Anurag007 You did not share how your logs are getting into NiFi, but once ingested, you could use a PartitionRecord processor with one of the following readers to handle parsing your log files:

- GrokReader
- SyslogReader
- Syslog5424Reader

You can then use your choice of Record Writer for the individual split log outputs. You would then add one custom property that is used to group like log entries by log level. This custom property becomes a new FlowFile attribute on the output FlowFiles. You can then use a RouteOnAttribute processor to filter out only FlowFiles where the log_level is set to ERROR.

Here is a simple flow I created that tails NiFi's app log, partitions log entries by log level, and then routes entries for WARN or ERROR. I use the GrokReader with the following Grok Expression:

%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} \[%{DATA:thread}\] %{DATA:class} %{GREEDYDATA:message}

I then chose the JsonRecordSetWriter. The dynamic property I added on PartitionRecord is:

property = log_level
value = /level

In my RouteOnAttribute processor, I can route based on that new "log_level" attribute that will exist on each partitioned FlowFile, using two dynamic properties which each become a new relationship:

property = ERROR
value = ${log_level:equals('ERROR')}
property = WARN
value = ${log_level:equals('WARN')}

Hope this helps, Matt
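For illustration, the partition-by-level step is roughly equivalent to this Python sketch. The regex is a rough stand-in for the Grok expression above, and the sample log lines are invented:

```python
import re

# Rough regex stand-in for the Grok expression above (illustrative only)
LOG_RE = re.compile(
    r"(?P<timestamp>\S+ \S+) (?P<level>[A-Z]+) "
    r"\[(?P<thread>[^\]]*)\] (?P<cls>\S+) (?P<message>.*)"
)

def partition_by_level(lines):
    # Group parsed records by log level, like PartitionRecord with value /level
    groups = {}
    for line in lines:
        m = LOG_RE.match(line)
        if m:
            groups.setdefault(m.group("level"), []).append(m.groupdict())
    return groups

sample = [
    "2020-12-23 13:39:00,123 INFO [Timer-Driven Process Thread-1] o.a.n.SomeClass all good",
    "2020-12-23 13:39:01,456 ERROR [Timer-Driven Process Thread-2] o.a.n.SomeClass it failed",
]
print(sorted(partition_by_level(sample)))  # → ['ERROR', 'INFO']
```

RouteOnAttribute then simply checks the resulting level value per group, as the expression-language properties above do.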
12-21-2020
09:44 AM
1 Kudo
Thanks Stephane, the QueryRecord processor works as suggested.
12-17-2020
01:24 AM
Hi @justenji Same for me; I've tried using the Avro schema generator, including schema inference in NiFi, but no luck.
12-14-2020
07:23 AM
Hello @justenji Thanks for the detailed explanation. This is also what I have tested on my side, and in my opinion it does the job while staying "record oriented" 🙂 Have a nice day
12-09-2020
10:16 AM
Look at everything that has *Record in the name for formats like CSV, JSON, Parquet, Avro, and logs. https://www.datainmotion.dev/2020/07/ingesting-all-weather-data-with-apache.html
11-30-2020
11:57 AM
As suggested above, update your post with your processor and its reader and writer settings. It sounds like you have something misconfigured. If possible, show us a screenshot of your flow too.
11-30-2020
09:09 AM
Hello, have you tried with a syslog listener on the NiFi side?
11-30-2020
09:02 AM
Hello, NiFi can easily get data from a database; do you really need a Python script for that? If you really want to do what you describe here, I think you should use UpdateAttribute to set your database information in the FlowFile attributes, and then from your script use the getAttribute function to retrieve it.
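As a sketch of that idea, here is the kind of logic the script side would perform once the attributes are available. The attribute names (db.host, db.port, db.name) are made-up examples, not something NiFi sets for you:

```python
def build_dsn(attrs):
    # attrs stands in for the FlowFile attribute map populated by UpdateAttribute;
    # the attribute names used here are hypothetical examples.
    return "postgresql://{0}:{1}/{2}".format(
        attrs["db.host"], attrs["db.port"], attrs["db.name"]
    )

print(build_dsn({"db.host": "dbhost", "db.port": "5432", "db.name": "sales"}))
# → postgresql://dbhost:5432/sales
```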
07-10-2020
11:47 AM
1 Kudo
I'm happy to see you resolved your issue. Please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future.