Support Questions

Metron Parser not passing messages upstream


I'm using a GROK parser, consuming from the Kafka topic 'auditd'. The Storm topology for the 'auditd' parser looks like it's working correctly:


We can see 9540 messages 'Acked', with only a handful of reported errors in the logs (for records which didn't match my GROK expression).

We can also see within Metron UI that it's reporting throughput:


However, this data never makes it to the enrichment or indexing topology. As we can see, the Storm topologies have never received any data:



For some reason the data is not being passed properly downstream. FWIW, I used the Metron UI to configure this telemetry source. I can also see json files defined for the predefined telemetry sources in:


However, in these folders I do not see any configuration files for the new auditd source. I have assumed that this is not required because the REST API configures the topology via the UI.
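For reference, sensors created through the Management UI are stored in ZooKeeper rather than as local json files, which would explain the missing file. A minimal sketch of dumping the live configs with Metron's zk_load_configs.sh (the METRON_HOME path and ZooKeeper quorum below are assumptions for a typical install):

```shell
# Sketch: build the command that dumps Metron's live sensor configs from ZooKeeper.
# METRON_HOME and the ZooKeeper quorum are placeholders; adjust to your install.
METRON_HOME="/usr/metron/0.4.1"
ZK_QUORUM="node1:2181"
DUMP_CMD="$METRON_HOME/bin/zk_load_configs.sh -m DUMP -z $ZK_QUORUM"
echo "$DUMP_CMD"   # run this on a Metron node; the auditd sensor should appear in the dump
```

If the auditd sensor shows up in the dump, the parser config did make it into ZooKeeper and the local json folders can be ignored.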

Is there anything obviously missing that might explain why these messages are not being passed downstream?

Thanks in advance


Hi Oliver,

Did you figure out a solution to the above problem? I am facing a similar issue.

I am using "BasicPaloAltoFirewallParser". I don't see any errors while parsing, but the data is not passed on to the "enrichments" or "indexing" topic.




Hi @Bharath Phatak,

I've had no luck yet working out what is causing this issue. I'm just tweaking the infrastructure to resolve some other problems I've been facing; once that's complete I will get back to debugging this one. If I find anything I'll update you.

Cheers, Ollie

Hi @Oliver Fletcher,

Thanks for the reply. I too will update if I find anything useful.



Hi @Oliver Fletcher,

I am able to see the data being pushed into the "indexing" topic and visible in Kibana. The issue was with parsing.

Please have a look at the worker logs in your setup to find the issue.



Super Collaborator

If you run into anything similar:

Check the detailed logs of the relevant topology workers. The Metron Storm workers will run on the hosts where your Storm Supervisors run.

On those hosts, the logs are usually at:

/var/log/storm/workers-artifacts/bro-2-1496396293/6700/worker.log (or something along those lines, depending on your parser etc.). Find your relevant topology id, e.g. 'bro-2-1496396293', via the Storm UI on port 8744.

If you only find a file 'worker.yaml' at /var/log/storm/workers-artifacts/&lt;topology_ID&gt;/6700/, it means your worker was not able to start up in the first place. Then the problem is deeper and you need to find the cause in

/var/log/storm/supervisor.log (on that same host)
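Putting the paths above together, a small sketch for locating the right log file (the topology id below is the example from this thread; yours will differ):

```shell
# Sketch: compose the worker log path for a given topology id and worker port.
# 'bro-2-1496396293' is the example id from above; look yours up in the Storm UI.
TOPOLOGY_ID="bro-2-1496396293"
WORKER_PORT="6700"
WORKER_LOG="/var/log/storm/workers-artifacts/$TOPOLOGY_ID/$WORKER_PORT/worker.log"
echo "$WORKER_LOG"   # tail -f this file on the supervisor host
# If the directory only contains worker.yaml, the worker never started;
# check /var/log/storm/supervisor.log on that same host instead.
```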


If set up correctly, you can sometimes drill via the Storm UI all the way through to the Spout or Bolt that seems to be doing nothing, and check its logs in the browser. This can be done via 'files' at the level where you see the details under "Executors (All time)".



Hi @Jasper

I have a similar issue, where the data doesn't make it to the enrichments and indexing topologies. I tried debugging the workers but could not find any issues or exceptions.

I can see in the Storm UI that no data flows through the ParserBolt.

Is there anything else I have to look into?

FYI, I am working with the Grok Parser



Thanks in Advance

Girish N

@Girish N

From there, you should first check whether the Kafka topic you are consuming from has anything on it in the first place:

/usr/hdp/ --zookeeper $YOUR_ZOOKEEPER_HOST --topic $YOUR_GROK_PARSER --from-beginning
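The path above got cut off after /usr/hdp/; on a typical HDP install the full invocation would look something like the sketch below (the kafka-broker path, ZooKeeper port, and topic name are assumptions; adjust to your layout):

```shell
# Sketch: compose the full console-consumer command. Older Kafka versions, as
# shipped with HDP, accept --zookeeper. All values below are placeholders.
KAFKA_BIN="/usr/hdp/current/kafka-broker/bin"
ZK_HOST="node1:2181"   # your $YOUR_ZOOKEEPER_HOST
TOPIC="auditd"         # your parser's input topic
CONSUME_CMD="$KAFKA_BIN/kafka-console-consumer.sh --zookeeper $ZK_HOST --topic $TOPIC --from-beginning"
echo "$CONSUME_CMD"    # run on a Kafka node; no output from the consumer means the topic is empty
```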

If that does not clarify things, you can run the parser topology in DEBUG mode for a couple of minutes and check the worker.log again for additional details about possible errors.
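One way to switch a running topology to DEBUG without redeploying is Storm's set_log_level command (Storm 1.x; the topology name and logger below are assumptions):

```shell
# Sketch: raise the Metron logger of a running topology to DEBUG for 300 seconds,
# after which Storm reverts it automatically. 'auditd' is a placeholder topology name.
TOPOLOGY="auditd"
LEVEL_CMD="storm set_log_level $TOPOLOGY -l org.apache.metron=DEBUG:300"
echo "$LEVEL_CMD"   # run on a node with the storm client installed, then re-check worker.log
```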

Thanks @Jasper

As suggested, I started the topology in debug mode, and I could see logs like:

Grok parser parsing message : type=USER_ACCT msg=audit(1507184009.344:110470): ....

Grok parser parsed message: {"ses":"4294967295","original_string": .....

Grok parser validating message: {"ses":"4294967295","original_string":.....

Grok parser did not validate message : {"ses":"4294967295","original_string":.....

Is the Grok parser not validating the message the reason why data doesn't flow from $topic to enrichments?

Thanks in Advance


I was able to resolve the issue, now the data flows from my topic to enrichments.

In my case, I had missed the timestampField in the parser config; adding it resolved the issue.
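For anyone hitting the same symptom, a minimal Grok sensor config with that field in place might look like the sketch below (the class name is Metron's stock Grok parser; grokPath, patternLabel, and the timestamp field name are illustrative and must match what your pattern actually extracts):

```json
{
  "parserClassName": "org.apache.metron.parsers.GrokParser",
  "sensorTopic": "auditd",
  "parserConfig": {
    "grokPath": "/patterns/auditd",
    "patternLabel": "AUDITD",
    "timestampField": "timestamp"
  }
}
```

Without a timestamp on the parsed message, validation fails (the "did not validate" log line above) and the message is never emitted to enrichments.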

Thanks for your guidance

Girish N


@Girish N

Glad it worked out for you.

@Oliver Fletcher

Hi Oliver, please mark the question as answered if it has been sufficiently answered.