NiFi 0.7 PutSplunk processor to send log files to Splunk for NiFi alerts?
Labels: Apache NiFi
Created 08-23-2016 11:04 AM
I am new to alerting and monitoring. If we want to set up alerts for NiFi using Splunk, can we use the PutSplunk NiFi processor, or should we send the log files directly to Splunk?
Currently our applications that use Splunk send their log files directly to Splunk for alerting. Which is the more effective way to achieve monitoring and alerting for NiFi using Splunk? Thank you.
Created 08-23-2016 02:49 PM
You probably have a couple of options...
I don't think you want the same NiFi instance that runs your main dataflow to also monitor itself with PutSplunk. If you had TailFile -> PutSplunk, where TailFile was tailing that same instance's logs, it could create a cycle: the more you tailed and sent to Splunk, the more logs you produced, so the more you tailed, and so on.
I would suggest a second NiFi instance (maybe even the MiNiFi Java agent) to monitor the logs of the main instance.
Another, possibly simpler, solution: configure NiFi's logback.xml to add a UDP/TCP appender that can send logs to Splunk. That way anything NiFi writes to nifi-app.log will get forwarded to Splunk.
A last option, slightly different from logging: NiFi has a concept called a ReportingTask that can be used to send metrics and statistics to other systems. If that is the information you are interested in, you could implement a custom ReportingTask to send data to Splunk.
Created 02-22-2017 11:12 AM
Hi Bryan,
How do I use logback.xml to send logs to Splunk? Any suggestion, link, or example would be helpful.
Thanks,
Created 02-22-2017 03:36 PM
You would need to set up an appender in logback.xml that can talk to a Splunk input. One example would be a SocketAppender sending to a TCP input in Splunk.
https://github.com/bbende/jsonevent-producer/blob/master/src/main/resources/logback.xml#L8-L11
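For reference, here is a minimal sketch (not the linked example itself) of what such an appender could look like in NiFi's conf/logback.xml. It uses logback's standard SocketAppender; the host splunk.example.com, port 1514, and reconnection delay are placeholders you would replace with the details of your own Splunk TCP input.

```xml
<!-- Sketch only: forward NiFi log events to a remote TCP listener.
     Add this inside the existing <configuration> element of conf/logback.xml. -->
<appender name="SPLUNK" class="ch.qos.logback.classic.net.SocketAppender">
    <!-- Placeholder host/port: point these at your Splunk TCP input -->
    <remoteHost>splunk.example.com</remoteHost>
    <port>1514</port>
    <!-- Keep retrying if the Splunk input is temporarily unreachable -->
    <reconnectionDelay>10000</reconnectionDelay>
    <includeCallerData>false</includeCallerData>
</appender>
```

You would then add <appender-ref ref="SPLUNK"/> to the existing root logger (or a more specific logger) so that whatever goes to nifi-app.log is also sent over the socket. Note that logback's SocketAppender sends serialized logging events rather than plain text, so the Splunk side has to be able to consume that; depending on your input you may prefer an appender that writes through a PatternLayout, such as the logback appenders in Splunk's logging library for Java.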
Created 02-23-2017 05:57 AM
@Bryan Thanks Bryan. I will try this and let you know whether it works.
