12-23-2020
01:39 PM
@Anurag007 You did not share how your logs are getting into NiFi. But once ingested, you could use a PartitionRecord processor with one of the following readers to handle parsing your log files:
- GrokReader
- SyslogReader
- Syslog5424Reader

You can then use your choice of Record Writer to output the individual split log entries. You would also add one custom property that is used to group like log entries by their log level. This custom property becomes a new FlowFile attribute on the output FlowFiles. You can then use a RouteOnAttribute processor to filter out only the FlowFiles where the log_level is set to ERROR.

Here is a simple flow I created that tails NiFi's app log, partitions the log entries by log_level, and then routes entries at the WARN or ERROR level. I use the GrokReader with the following Grok Expression:

%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} \[%{DATA:thread}\] %{DATA:class} %{GREEDYDATA:message}

I then chose to use the JsonRecordSetWriter. The dynamic property I added on PartitionRecord was:

Property = log_level
Value = /level
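Just to illustrate what that does (the log line below is made up, and the exact output depends on your writer settings), a tailed app log line such as:

2020-12-23 13:39:02,123 ERROR [Timer-Driven Process Thread-4] o.a.nifi.processors.standard.InvokeHTTP Connection refused

would be parsed by that Grok Expression into the fields timestamp, level, thread, class, and message, and the JsonRecordSetWriter would emit a record roughly like:

{"timestamp":"2020-12-23 13:39:02,123","level":"ERROR","thread":"Timer-Driven Process Thread-4","class":"o.a.nifi.processors.standard.InvokeHTTP","message":"Connection refused"}

Because of the log_level dynamic property, the FlowFile carrying that partition of records also gets the attribute log_level=ERROR.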
In my RouteOnAttribute processor, I can route based on that new "log_level" attribute that will exist on each partitioned FlowFile, using two dynamic properties which each become a new relationship:

property = ERROR
value = ${log_level:equals('ERROR')}
property = WARN
value = ${log_level:equals('WARN')}
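So, to walk through the routing: a FlowFile with the attribute log_level=ERROR makes ${log_level:equals('ERROR')} evaluate to true and is routed to the new ERROR relationship. With the default Routing Strategy of "Route to Property name", FlowFiles whose log_level matches neither property (INFO, DEBUG, etc.) go to the unmatched relationship, which you can auto-terminate or handle separately if you only care about ERROR (and WARN).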
Hope this helps,
Matt