Created on 07-19-2017 03:50 PM - edited 08-18-2019 01:47 AM
I'm using a Grok parser, consuming from the Kafka topic 'auditd'. The Storm topology for the 'auditd' parser looks like it's working correctly:
We can see 9540 messages 'Acked', with only a handful of reported errors in the logs (for records that didn't match my Grok expression).
We can also see in the Metron UI that it's reporting throughput:
However, this data never makes it to the enrichment or indexing topology. As we can see, those Storm topologies have never received any data:
For some reason the data is not being passed properly downstream. FWIW, I used the Metron UI to configure this telemetry source. I can also see JSON files defined for the predefined telemetry sources in:
/usr/metron/0.4.0/config/zookeeper/{indexing,parsers,enrichments}
However, in these folders I do not see any configuration files for the new auditd source. I have assumed this is not required because the REST API configures the topology via the UI.
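One way I could double-check what configuration actually ended up in ZooKeeper (a sketch, assuming the default 0.4.0 install path and the bundled zk_load_configs.sh utility; substitute your own ZooKeeper host):
export METRON_HOME=/usr/metron/0.4.0
$METRON_HOME/bin/zk_load_configs.sh -m DUMP -z $ZOOKEEPER_HOST:2181    # prints the sensor configs currently held in ZooKeeper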
Is there anything obviously missing that might explain why these messages are not being passed downstream?
Thanks in advance
Created 07-27-2017 01:20 PM
Hi Oliver,
Did you figure out a solution for the above problem? I am facing a similar issue.
I am using the "BasicPaloAltoFirewallParser". I don't see any errors while parsing, but the data is not passed on to the "enrichments" or "indexing" topics.
Thanks,
Bharath
Created 07-27-2017 02:20 PM
Hi @Bharath Phatak,
I've had no luck finding what is causing this issue yet. I'm currently tweaking the infrastructure to resolve some other issues I've been facing; once that's done I'll get back to debugging this one. If I find anything I'll update you.
Cheers, Ollie
Created 07-27-2017 04:49 PM
Hi @Oliver Fletcher,
Thanks for the reply. I too will update you if I find anything useful.
Regards,
Bharath
Created 08-03-2017 08:31 AM
Hi @Oliver Fletcher,
I am now able to see the data being pushed into the "indexing" topic and visible in Kibana. The issue was with parsing.
Please have a look at the worker logs in your setup to identify the issue.
Thanks,
Bharath
Created on 08-16-2017 09:12 AM - edited 08-18-2019 01:47 AM
If you are seeing anything similar to this:
Check the detailed logs of the relevant topology workers. The Metron Storm workers will run on the hosts where your Storm Supervisors run.
On those hosts, the logs are usually at:
/var/log/storm/workers-artifacts/bro-2-1496396293/6700/worker.log (or something along those lines, depending on the parser, worker port, etc.). Find the relevant topology id (like 'bro-2-1496396293') via the Storm UI on port 8744.
If you only find a file 'worker.yaml' at /var/log/storm/workers-artifacts/<topology_ID>/6700/, it means your worker was not able to start up in the first place. Then the problem lies deeper and you need to find the cause in
/var/log/storm/supervisor.log (on that same host)
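A sketch of what that looks like on the supervisor host (the topology id 'auditd-5-1500000000' and worker port 6700 below are just examples; look up your own in the Storm UI):
ls /var/log/storm/workers-artifacts/                               # which topologies have worker logs on this host
less /var/log/storm/workers-artifacts/auditd-5-1500000000/6700/worker.log
grep -iE 'error|exception' /var/log/storm/supervisor.log           # if the worker never started at all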
Alternatively:
If set up correctly, you can sometimes drill all the way down through the Storm UI to the spout or bolt that doesn't seem to be doing anything, and check its logs in the browser. This can be done via the 'files' link at the level where you get the details of the "Executors (All time)" table.
Created 10-06-2017 12:44 PM
Hi @Jasper
I have a similar issue, where the data doesn't make it to the enrichments and indexing topologies. I tried debugging the workers but could not find any issues or exceptions.
I can see in the Storm UI that no data flows through the ParserBolt.
Is there anything else I have to look into?
FYI, I am working with the Grok Parser
(screenshots of the Storm UI topology stats attached)
Thanks in Advance
Girish N
Created 10-09-2017 08:43 AM
From there, you should first check whether the Kafka topic you are consuming from has anything on it in the first place:
/usr/hdp/2.5.3.0-37/kafka/bin/kafka-console-consumer.sh --zookeeper $YOUR_ZOOKEEPER_HOST --topic $YOUR_GROK_PARSER --from-beginning
If that does not clarify things, you can run the parser topology in DEBUG mode for a couple of minutes and check the worker.log again for additional details about possible errors.
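One way to do that without redeploying (a sketch, assuming Storm 1.x; substitute your own topology name and adjust the timeout) is the dynamic log level feature:
storm set_log_level -l ROOT=DEBUG:300 your_parser_topology    # DEBUG for 300 seconds, then revert automatically
The same can also be done from the topology page in the Storm UI ('Change Log Level').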
Created 10-09-2017 09:40 AM
Thanks @Jasper
As suggested, I started the topology in DEBUG mode, and I can see log lines like:
Grok parser parsing message : type=USER_ACCT msg=audit(1507184009.344:110470): ....
Grok parser parsed message: {"ses":"4294967295","original_string": .....
Grok parser validating message: {"ses":"4294967295","original_string":.....
Grok parser did not validate message : {"ses":"4294967295","original_string":.....
Is the Grok parser failing to validate the message the reason why data doesn't flow from $topic to enrichments?
Thanks in Advance
Created 10-09-2017 10:05 AM
I was able to resolve the issue; the data now flows from my topic to enrichments.
In my case, I had missed the timestampField in the parser config, and adding it resolved the issue.
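For anyone else hitting this: as far as I can tell, the parser only considers a message valid if it ends up with a 'timestamp' field, which is what timestampField controls. A minimal sketch of a sensor parser config with it set (the grokPath, patternLabel and field name here are illustrative and must match your own Grok pattern):
{
  "parserClassName": "org.apache.metron.parsers.GrokParser",
  "sensorTopic": "auditd",
  "parserConfig": {
    "grokPath": "/patterns/auditd",
    "patternLabel": "AUDITD",
    "timestampField": "timestamp"
  }
}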
Thanks for your guidance
Girish N
Created 10-09-2017 06:28 PM
Glad it worked out for you
Created 10-09-2017 08:23 AM
Hi Oliver, please mark the question as answered if it has been sufficiently answered.