NiFi generating huge logs

I have configured the NiFi PutElasticsearch processor to store data in an Elasticsearch instance where the Shield plugin is enabled. I tried the Shield configuration for the PutElasticsearch processor and then stopped the processor because it was throwing errors. However, I see that the same error messages (of the kind shown below) are still being written to nifi-app.log continuously, generating several MB of logs roughly every 5 seconds.

************************Error Message*****************

2017-10-23 12:55:02,730 INFO [elasticsearch[Martin Gold][generic][T#16]] org.elasticsearch.client.transport [Martin Gold] failed to get node info for {#transport#-1}{xx.x.xx.xx}{xx.x.xx.xx:9300}, disconnecting...
org.elasticsearch.transport.RemoteTransportException: [ip-xx-x-xx-xx.ec2.internal][xx.x.xx.xx:9300][cluster:monitor/nodes/liveness]
Caused by: org.elasticsearch.ElasticsearchSecurityException: missing authentication token for action [cluster:monitor/nodes/liveness]
    at org.elasticsearch.shield.support.Exceptions.authenticationError(Exceptions.java:39)
    at org.elasticsearch.shield.authc.DefaultAuthenticationFailureHandler.missingToken(DefaultAuthenticationFailureHandler.java:70)
    at org.elasticsearch.shield.authc.InternalAuthenticationService.authenticateWithRealms(InternalAuthenticationService.java:282)
    at org.elasticsearch.shield.authc.InternalAuthenticationService.authenticate(InternalAuthenticationService.java:184)
    at org.elasticsearch.shield.transport.ServerTransportFilter$NodeProfile.inbound(ServerTransportFilter.java:112)
    at org.elasticsearch.shield.transport.ShieldServerTransportService$ProfileSecuredRequestHandler.messageReceived(ShieldServerTransportService.java:176)
    at org.elasticsearch.transport.RequestHandlerRegistry.processMessageReceived(RequestHandlerRegistry.java:75)
    at org.elasticsearch.transport.netty.MessageChannelHandler.handleRequest(MessageChannelHandler.java:245)
    at org.elasticsearch.transport.netty.MessageChannelHandler.messageReceived(MessageChannelHandler.java:114)
    at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
    at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
    at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296)
    at org.jboss.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:462)
    at org.jboss.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:443)
    at org.jboss.netty.handler.codec.frame.FrameDecoder.messageReceived(FrameDecoder.java:303)
    at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
    at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
    at org.elasticsearch.common.netty.OpenChannelsHandler.handleUpstream(OpenChannelsHandler.java:75)
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
    at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
    at org.jboss.netty.handler.ipfilter.IpFilteringHandlerImpl.handleUpstream(IpFilteringHandlerImpl.java:154)
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559)
    at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268)
    at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255)
    at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)
    at org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108)
    at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:337)
    at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89)
    at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
    at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
    at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)

*************************************

May I know what could be causing this huge, continuous log generation even though the PutElasticsearch processor is stopped?
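
In case it is relevant: the messages come from the org.elasticsearch.client.transport logger, so I assume an override in NiFi's conf/logback.xml along the lines of the sketch below (assuming the default NiFi logback setup) would at least silence them, but I would still like to understand why they keep being generated after the processor is stopped.

    <!-- Sketch only, assuming the default NiFi conf/logback.xml: raise the level
         of the Elasticsearch transport client logger so that these INFO
         "failed to get node info" messages are no longer written to nifi-app.log. -->
    <logger name="org.elasticsearch.client.transport" level="WARN"/>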
