PutSplunk processor and Splunk grouping multiple syslog JSONs into one event

Expert Contributor

Hi,

I'm using the PutSplunk processor to sink syslogs in JSON format to a Splunk server.

But on the Splunk side, I see multiple JSONs grouped into one event.

[screenshot: Splunk search results showing several JSON messages grouped into one event]

How can I configure PutSplunk and my Splunk server so that each JSON appears as its own event?

Regards,

Wendell


4 REPLIES


Make sure you split the data using the SplitJson processor in NiFi before sending it to Splunk. The syslog receiver may bundle incoming messages depending on the network setup, and it knows nothing about the actual data format, such as JSON.
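For reference, a minimal sketch of that processor's configuration, assuming the flow file content is a JSON array at the root (if the messages are newline-delimited instead, the Message Delimiter approach described below is the better fit, and the exact JsonPath depends on your structure):

    SplitJson   (emits one flow file per element of the selected array)
      JsonPath Expression : $.*

Each resulting flow file then carries a single JSON document into PutSplunk.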

avatar
Master Guru

PutSplunk has two modes of operation: it can send the entire content of the flow file as a single message, or it can stream the content of the flow file and split it on a delimiter. It chooses between these modes based on whether or not the "Message Delimiter" property is set.

In your case I am assuming you have multiple JSON documents in a flow file, so you probably want to set "Message Delimiter" to whatever is separating them, likely a \n.
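For example, something along these lines (the hostname and port are placeholders for wherever your Splunk input is listening):

    PutSplunk
      Hostname          : splunk.example.com
      Port              : 9997
      Protocol          : TCP
      Message Delimiter : \n

With the delimiter set, each newline-separated JSON document in the flow file is sent to Splunk as its own message.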

Expert Contributor (accepted solution)

Hi,

Actually, the flow file in the queue before PutSplunk does contain only one JSON.

For some reason Splunk groups them together. If I choose a different JSON source type (one without timestamp recognition) for the Splunk data input, then each JSON ends up in its own event. But @Bryan Bende's "Message Delimiter" suggestion is worth adding as well.
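For anyone hitting the same issue, the equivalent Splunk-side setting is to turn off timestamp-based line merging for the sourcetype that receives the NiFi output. A rough props.conf sketch (the sourcetype name here is made up; use whatever your input is configured with):

    # props.conf on the receiving Splunk instance
    [nifi_syslog_json]
    SHOULD_LINEMERGE = false
    LINE_BREAKER = ([\r\n]+)
    KV_MODE = json

SHOULD_LINEMERGE = false makes Splunk treat every delimited line (i.e. every JSON document) as its own event instead of merging lines until it sees a new timestamp.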

Regards,

Wendell

New Contributor

Hello @Wendell Bu, I am trying the same thing: sending events from NiFi to Splunk using the PutSplunk processor. I was stuck initially and could not see any events in Splunk. My AttributesToJSON processor (in Data Provenance I can see the raw logs converted to JSON format) is connected to the PutSplunk processor, which has the hostname, port, and message delimiter configured as in the screenshot below. On the Splunk side, an input port is defined. I'm not sure if I'm missing something. Can you please let me know if there are any other steps I need to follow?

[screenshot: PutSplunk processor configuration]


Appreciate your help in advance !
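For reference, the Splunk-side "input port" mentioned above is usually a TCP input, defined through the Splunk UI or in inputs.conf along these lines (the port and sourcetype are placeholders and must match what PutSplunk points at):

    # inputs.conf on the receiving Splunk instance
    [tcp://9997]
    sourcetype = nifi_syslog_json
    connection_host = ip

If the port, protocol (TCP vs. UDP), and network path all line up and events still do not appear, checking splunkd.log on the Splunk side is a reasonable next step.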