
Is there a best practice for logging within HDP?

New Contributor

1 ACCEPTED SOLUTION

Guru

You can use NiFi to ingest logs into Solr or Elasticsearch and visualize them using Banana or Kibana. You can also use NiFi to ingest them into Splunk. Since all Hadoop component logs use Log4j, it is easy to standardize log collection.
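
For a concrete picture of the parse-and-index step such a flow performs, here is a minimal Python sketch that parses one Log4j-style line and posts it to Elasticsearch over REST. The index name `hadoop-logs`, the sample log layout, and the `localhost:9200` address are assumptions for illustration, not details from this thread:

```python
# Minimal sketch of the ingest step a NiFi flow would perform:
# parse a Log4j-formatted line and index it into Elasticsearch
# over its REST API (endpoint shape assumes Elasticsearch 7+).
import re
import requests

# Typical Log4j layout: "2016-05-10 12:34:56,789 INFO Logger: message"
# Real layouts vary per component; adjust the regex to your pattern.
LOG4J_LINE = re.compile(
    r"(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}) +"
    r"(?P<level>[A-Z]+) +(?P<logger>[^:]+): (?P<message>.*)"
)

def index_log_line(line, host="http://localhost:9200", index="hadoop-logs"):
    """Parse one Log4j line and index it as a JSON document."""
    match = LOG4J_LINE.match(line)
    if not match:
        return None  # skip lines that don't match the layout
    doc = match.groupdict()
    resp = requests.post(f"{host}/{index}/_doc", json=doc)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    sample = ("2016-05-10 12:34:56,789 INFO "
              "org.apache.hadoop.hdfs.DataNode: Heartbeat sent")
    print(index_log_line(sample))
```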


5 REPLIES

Master Mentor

New Contributor

Thanks @Artem Ervits, some of these links are helpful; however, let me clarify a little more...

As part of our Hadoop applications, we will be generating logs at different levels from different components. You will typically have end-user computing alongside Hadoop processes like Hive, Pig, and MapReduce. Given that the application may have some components outside the Hadoop cluster, what is the best practice for getting an end-to-end view, from the logs, of what happened in an application?
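
For illustration only (this is not prescribed anywhere in the thread): one common way to get such an end-to-end view is to stamp every log line with a correlation ID that is passed between components, so events from inside and outside the cluster can be joined later in Solr or Elasticsearch. The field name `corr_id` and the logger names below are hypothetical:

```python
# Sketch: inject a shared correlation ID into every log record using
# the standard logging module, so one job's events can be searched
# end to end across components.
import logging
import uuid

logging.basicConfig(
    format="%(asctime)s %(levelname)s [corr=%(corr_id)s] %(name)s: %(message)s",
    level=logging.INFO,
)

def get_job_logger(name, corr_id=None):
    """Return a logger that stamps a correlation ID on every record."""
    corr_id = corr_id or uuid.uuid4().hex[:12]
    return logging.LoggerAdapter(logging.getLogger(name), {"corr_id": corr_id})

# Pass the same ID to each component a job touches; searching for
# corr=<id> in the log store then reconstructs the whole flow.
log = get_job_logger("ingest.webapp")
log.info("received upload, handing off to Hive load")
```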

Master Mentor

I happen to agree with @Ravi Mutyala nifi is great to bring logs into hadoop and track logs at the edge, there are built-in parsers and filters, you'll feel right at home with Nifi. Here are some nifi templates including working with logs https://cwiki.apache.org/confluence/display/NIFI/Example+Dataflow+Templates
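
To make the tail-filter-forward shape of such a flow concrete, here is a rough Python analogue of a simple NiFi pipeline (TailFile, then a route-on-content filter, then a push to Solr). The Solr URL, the collection name `hadoop-logs`, the log file path, and the `line_s` dynamic field are assumptions, not settings from this thread:

```python
# Sketch of a tail -> filter -> index pipeline: follow a Log4j file,
# keep only WARN/ERROR/FATAL lines, and post them to Solr's JSON
# update handler (available in Solr 5+).
import time
import requests

SOLR_UPDATE = "http://localhost:8983/solr/hadoop-logs/update/json/docs?commit=true"
KEEP_LEVELS = ("WARN", "ERROR", "FATAL")

def follow(path):
    """Yield lines as they are appended to a file, like NiFi's TailFile."""
    with open(path) as f:
        f.seek(0, 2)  # start at the current end of the file
        while True:
            line = f.readline()
            if line:
                yield line.rstrip("\n")
            else:
                time.sleep(1)  # wait for new data

for line in follow("/var/log/hadoop/hdfs/hadoop-hdfs-datanode.log"):
    if any(level in line for level in KEEP_LEVELS):  # crude content routing
        requests.post(SOLR_UPDATE, json={"line_s": line})
```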


You might want to look at this: https://github.com/abajwa-hw/logsearch-service

Great tool for analyzing logs 🙂