13 Jul 2017 14:20:54,491 INFO [lifecycleSupervisor-1-0] (org.apache.flume.node.PollingPropertiesFileConfigurationProvider.start:61) - Configuration provider starting
13 Jul 2017 14:20:54,499 INFO [conf-file-poller-0] (org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run:133) - Reloading configuration file:/usr/hdp/current/flume-server/conf/demo/flume.conf
13 Jul 2017 14:20:54,504 INFO [conf-file-poller-0] (org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty:1016) - Processing:sink
13 Jul 2017 14:20:54,505 INFO [conf-file-poller-0] (org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty:1016) - Processing:sink
13 Jul 2017 14:20:54,506 INFO [conf-file-poller-0] (org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty:1016) - Processing:sink
13 Jul 2017 14:20:54,506 INFO [conf-file-poller-0] (org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty:1016) - Processing:sink
13 Jul 2017 14:20:54,506 INFO [conf-file-poller-0] (org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty:1016) - Processing:sink
13 Jul 2017 14:20:54,506 INFO [conf-file-poller-0] (org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty:1016) - Processing:sink
13 Jul 2017 14:20:54,507 INFO [conf-file-poller-0] (org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty:1016) - Processing:sink
13 Jul 2017 14:20:54,507 INFO [conf-file-poller-0] (org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty:1016) - Processing:sink
13 Jul 2017 14:20:54,507 INFO [conf-file-poller-0] (org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty:1016) - Processing:sink
13 Jul 2017 14:20:54,507 INFO [conf-file-poller-0] (org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty:930) - Added sinks: sink Agent: demo
13 Jul 2017 14:20:54,507 INFO [conf-file-poller-0] (org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty:1016) - Processing:sink
13 Jul 2017 14:20:54,507 INFO [conf-file-poller-0] (org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty:1016) - Processing:sink
13 Jul 2017 14:20:54,508 INFO [conf-file-poller-0] (org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty:1016) - Processing:sink
13 Jul 2017 14:20:54,523 INFO [conf-file-poller-0] (org.apache.flume.conf.FlumeConfiguration.validateConfiguration:140) - Post-validation flume configuration contains configuration for agents: [demo]
13 Jul 2017 14:20:54,524 INFO [conf-file-poller-0] (org.apache.flume.node.AbstractConfigurationProvider.loadChannels:150) - Creating channels
13 Jul 2017 14:20:54,532 INFO [conf-file-poller-0] (org.apache.flume.channel.DefaultChannelFactory.create:40) - Creating instance of channel memory_channel type memory
13 Jul 2017 14:20:54,535 INFO [conf-file-poller-0] (org.apache.flume.node.AbstractConfigurationProvider.loadChannels:205) - Created channel memory_channel
13 Jul 2017 14:20:54,536 INFO [conf-file-poller-0] (org.apache.flume.source.DefaultSourceFactory.create:39) - Creating instance of source Netcat, type netcat
13 Jul 2017 14:20:54,542 INFO [conf-file-poller-0] (org.apache.flume.sink.DefaultSinkFactory.create:40) - Creating instance of sink: sink, type: org.apache.flume.sink.kafka.KafkaSink
13 Jul 2017 14:20:54,544 INFO [conf-file-poller-0] (org.apache.flume.sink.kafka.KafkaSink.configure:213) - Using the static topic: flume_topic this may be over-ridden by event headers
13 Jul 2017 14:20:54,544 INFO [conf-file-poller-0] (org.apache.flume.sink.kafka.KafkaSinkUtil.getKafkaProperties:34) - context={ parameters:{security.protocol=SASL_SSL, generateKeytabFor=flume/sandbox.hortonworks.com@HADOOP.COM, ssl.truststore.location=/etc/security/certificate/server.truststore.jks, sasl.mechanism=GSSAPI, channel=memory_channel, topic=flume_topic, bootstrap.servers=sandbox.hortonworks.com:6668,sandbox.hortonworks.com:6668, ssl.truststore.password=sasl_ssl, type=org.apache.flume.sink.kafka.KafkaSink, ssl.truststore.type=JKS, sasl.kerberos.service.name=kafka, brokerList=sandbox.hortonworks.com:6668,sandbox.hortonworks.com:6668} }
13 Jul 2017 14:20:54,555 INFO [conf-file-poller-0] (org.apache.flume.node.AbstractConfigurationProvider.getConfiguration:119) - Channel memory_channel connected to [Netcat, sink]
13 Jul 2017 14:20:54,559 INFO [conf-file-poller-0] (org.apache.flume.node.Application.startAllComponents:138) - Starting new configuration:{ sourceRunners:{Netcat=EventDrivenSourceRunner: { source:org.apache.flume.source.NetcatSource{name:Netcat,state:IDLE} }} sinkRunners:{sink=SinkRunner: { policy:org.apache.flume.sink.DefaultSinkProcessor@7a45af45 counterGroup:{ name:null counters:{} } }} channels:{memory_channel=org.apache.flume.channel.MemoryChannel{name: memory_channel}} }
13 Jul 2017 14:20:54,560 INFO [conf-file-poller-0] (org.apache.flume.node.Application.startAllComponents:145) - Starting Channel memory_channel
13 Jul 2017 14:20:54,561 INFO [conf-file-poller-0] (org.apache.flume.node.Application.startAllComponents:160) - Waiting for channel: memory_channel to start. Sleeping for 500 ms
13 Jul 2017 14:20:54,562 INFO [lifecycleSupervisor-1-2] (org.apache.flume.instrumentation.MonitoredCounterGroup.register:119) - Monitored counter group for type: CHANNEL, name: memory_channel: Successfully registered new MBean.
13 Jul 2017 14:20:54,562 INFO [lifecycleSupervisor-1-2] (org.apache.flume.instrumentation.MonitoredCounterGroup.start:95) - Component type: CHANNEL, name: memory_channel started
13 Jul 2017 14:20:55,064 INFO [conf-file-poller-0] (org.apache.flume.node.Application.startAllComponents:173) - Starting Sink sink
13 Jul 2017 14:20:55,065 INFO [conf-file-poller-0] (org.apache.flume.node.Application.startAllComponents:184) - Starting Source Netcat
13 Jul 2017 14:20:55,092 INFO [lifecycleSupervisor-1-1] (org.apache.flume.source.NetcatSource.start:150) - Source starting
13 Jul 2017 14:20:55,096 INFO [lifecycleSupervisor-1-1] (org.apache.flume.source.NetcatSource.start:164) - Created serverSocket:sun.nio.ch.ServerSocketChannelImpl[/127.0.0.1:56566]
13 Jul 2017 14:20:55,260 INFO [conf-file-poller-0] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink.configure:84) - Context parameters { parameters:{node=sandbox.hortonworks.com:, port=34545, type=org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink} }
13 Jul 2017 14:20:55,261 INFO [lifecycleSupervisor-1-0] (kafka.utils.Logging$class.info:70) - Verifying properties
13 Jul 2017 14:20:55,341 INFO [lifecycleSupervisor-1-0] (kafka.utils.Logging$class.info:70) - Property key.serializer.class is overridden to kafka.serializer.StringEncoder
13 Jul 2017 14:20:55,342 INFO [lifecycleSupervisor-1-0] (kafka.utils.Logging$class.info:70) - Property metadata.broker.list is overridden to sandbox.hortonworks.com:6668,sandbox.hortonworks.com:6668
13 Jul 2017 14:20:55,342 INFO [lifecycleSupervisor-1-0] (kafka.utils.Logging$class.info:70) - Property request.required.acks is overridden to 1
13 Jul 2017 14:20:55,342 INFO [lifecycleSupervisor-1-0] (kafka.utils.Logging$class.info:70) - Property serializer.class is overridden to kafka.serializer.DefaultEncoder
13 Jul 2017 14:20:55,469 INFO [lifecycleSupervisor-1-0] (org.apache.flume.instrumentation.MonitoredCounterGroup.register:119) - Monitored counter group for type: SINK, name: sink: Successfully registered new MBean.
13 Jul 2017 14:20:55,470 INFO [lifecycleSupervisor-1-0] (org.apache.flume.instrumentation.MonitoredCounterGroup.start:95) - Component type: SINK, name: sink started
13 Jul 2017 14:20:55,575 INFO [conf-file-poller-0] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink.start:67) - Starting Flume Metrics Sink
13 Jul 2017 14:20:55,585 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelCapacity = 1000
13 Jul 2017 14:20:55,587 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelFillPercentage = 0.0
13 Jul 2017 14:20:55,587 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventTakeSuccessCount = 0
13 Jul 2017 14:20:55,587 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelSize = 0
13 Jul 2017 14:20:55,587 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventTakeAttemptCount = 1
13 Jul 2017 14:20:55,587 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StartTime = 1499948454562
13 Jul 2017 14:20:55,587 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventPutAttemptCount = 0
13 Jul 2017 14:20:55,587 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventPutSuccessCount = 0
13 Jul 2017 14:20:55,587 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StopTime = 0
13 Jul 2017 14:20:55,587 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionCreatedCount = 0
13 Jul 2017 14:20:55,588 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchCompleteCount = 0
13 Jul 2017 14:20:55,588 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventDrainAttemptCount = 0
13 Jul 2017 14:20:55,588 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchEmptyCount = 0
13 Jul 2017 14:20:55,588 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StartTime = 1499948455470
13 Jul 2017 14:20:55,588 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchUnderflowCount = 0
13 Jul 2017 14:20:55,588 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionFailedCount = 0
13 Jul 2017 14:20:55,588 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionClosedCount = 0
13 Jul 2017 14:20:55,588 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - RollbackCount = 0
13 Jul 2017 14:20:55,588 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventDrainSuccessCount = 0
13 Jul 2017 14:20:55,588 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - KafkaEventSendTimer = 0
13 Jul 2017 14:20:55,588 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StopTime = 0
13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelCapacity = 1000
13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelFillPercentage = 0.0
13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventTakeSuccessCount = 0
13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelSize = 0
13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventTakeAttemptCount = 3
13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StartTime = 1499948454562
13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventPutAttemptCount = 0
13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventPutSuccessCount = 0
13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StopTime = 0
13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionCreatedCount = 0
13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchCompleteCount = 0
13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventDrainAttemptCount = 0
13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchEmptyCount = 2
13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StartTime = 1499948455470
13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchUnderflowCount = 0
13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionFailedCount = 0
13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionClosedCount = 0
13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - RollbackCount = 0
13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventDrainSuccessCount = 0
13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - KafkaEventSendTimer = 0
13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StopTime = 0
13 Jul 2017 14:21:15,599 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelCapacity = 1000
13 Jul 2017 14:21:15,599 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelFillPercentage = 0.0
13 Jul 2017 14:21:15,599 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventTakeSuccessCount = 0
13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelSize = 0
13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventTakeAttemptCount = 4
13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StartTime = 1499948454562
13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventPutAttemptCount = 0
13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventPutSuccessCount = 0
13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StopTime = 0
13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionCreatedCount = 0
13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchCompleteCount = 0
13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventDrainAttemptCount = 0
13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchEmptyCount = 4
13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StartTime = 1499948455470
13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchUnderflowCount = 0
13 Jul 2017 14:21:15,601 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionFailedCount = 0
13 Jul 2017 14:21:15,601 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionClosedCount = 0
13 Jul 2017 14:21:15,601 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - RollbackCount = 0
13 Jul 2017 14:21:15,601 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventDrainSuccessCount = 0
13 Jul 2017 14:21:15,601 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - KafkaEventSendTimer = 0
13 Jul 2017 14:21:15,601 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StopTime = 0
13 Jul 2017 14:21:25,606 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelCapacity = 1000
13 Jul 2017 14:21:25,606 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelFillPercentage = 0.1
13 Jul 2017 14:21:25,607 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventTakeSuccessCount = 0
13 Jul 2017 14:21:25,607 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelSize = 1
13 Jul 2017 14:21:25,608 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventTakeAttemptCount = 7
13 Jul 2017 14:21:25,608 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StartTime = 1499948454562
13 Jul 2017 14:21:25,608 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventPutAttemptCount = 1
13 Jul 2017 14:21:25,609 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventPutSuccessCount = 1
13 Jul 2017 14:21:25,609 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StopTime = 0
13 Jul 2017 14:21:25,610 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionCreatedCount = 0
13 Jul 2017 14:21:25,610 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchCompleteCount = 0
13 Jul 2017 14:21:25,610 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventDrainAttemptCount = 0
13 Jul 2017 14:21:25,611 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchEmptyCount = 5
13 Jul 2017 14:21:25,611 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StartTime = 1499948455470
13 Jul 2017 14:21:25,612 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchUnderflowCount = 0
13 Jul 2017 14:21:25,613 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionFailedCount = 0
13 Jul 2017 14:21:25,613 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionClosedCount = 0
13 Jul 2017 14:21:25,613 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - RollbackCount = 0
13 Jul 2017 14:21:25,614 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventDrainSuccessCount = 0
13 Jul 2017 14:21:25,614 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - KafkaEventSendTimer = 0
13 Jul 2017 14:21:25,615 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StopTime = 0
[root@sandbox ~]# cat /var/log/flume/flume-demo.log
(org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventTakeAttemptCount = 3 13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StartTime = 1499948454562 13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventPutAttemptCount = 0 13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventPutSuccessCount = 0 13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StopTime = 0 13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionCreatedCount = 0 13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchCompleteCount = 0 13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventDrainAttemptCount = 0 13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchEmptyCount = 2 13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StartTime = 1499948455470 13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] 
(org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchUnderflowCount = 0 13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionFailedCount = 0 13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionClosedCount = 0 13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - RollbackCount = 0 13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventDrainSuccessCount = 0 13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - KafkaEventSendTimer = 0 13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StopTime = 0 13 Jul 2017 14:21:15,599 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelCapacity = 1000 13 Jul 2017 14:21:15,599 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelFillPercentage = 0.0 13 Jul 2017 14:21:15,599 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventTakeSuccessCount = 0 13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] 
(org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelSize = 0 13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventTakeAttemptCount = 4 13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StartTime = 1499948454562 13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventPutAttemptCount = 0 13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventPutSuccessCount = 0 13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StopTime = 0 13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionCreatedCount = 0 13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchCompleteCount = 0 13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventDrainAttemptCount = 0 13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchEmptyCount = 4 13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] 
(org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StartTime = 1499948455470 13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchUnderflowCount = 0 13 Jul 2017 14:21:15,601 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionFailedCount = 0 13 Jul 2017 14:21:15,601 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionClosedCount = 0 13 Jul 2017 14:21:15,601 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - RollbackCount = 0 13 Jul 2017 14:21:15,601 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventDrainSuccessCount = 0 13 Jul 2017 14:21:15,601 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - KafkaEventSendTimer = 0 13 Jul 2017 14:21:15,601 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StopTime = 0 13 Jul 2017 14:21:25,606 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelCapacity = 1000 13 Jul 2017 14:21:25,606 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelFillPercentage = 0.1 13 Jul 2017 14:21:25,607 INFO [pool-4-thread-1] 
(org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventTakeSuccessCount = 0 13 Jul 2017 14:21:25,607 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelSize = 1 13 Jul 2017 14:21:25,608 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventTakeAttemptCount = 7 13 Jul 2017 14:21:25,608 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StartTime = 1499948454562 13 Jul 2017 14:21:25,608 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventPutAttemptCount = 1 13 Jul 2017 14:21:25,609 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventPutSuccessCount = 1 13 Jul 2017 14:21:25,609 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StopTime = 0 13 Jul 2017 14:21:25,610 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionCreatedCount = 0 13 Jul 2017 14:21:25,610 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchCompleteCount = 0 13 Jul 2017 14:21:25,610 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventDrainAttemptCount = 0 13 Jul 2017 14:21:25,611 INFO [pool-4-thread-1] 
13 Jul 2017 14:21:25,611 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchEmptyCount = 5
13 Jul 2017 14:21:25,611 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StartTime = 1499948455470
13 Jul 2017 14:21:25,612 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchUnderflowCount = 0
13 Jul 2017 14:21:25,612 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionFailedCount = 0
13 Jul 2017 14:21:25,613 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionClosedCount = 0
13 Jul 2017 14:21:25,613 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - RollbackCount = 0
13 Jul 2017 14:21:25,614 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventDrainSuccessCount = 0
13 Jul 2017 14:21:25,614 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - KafkaEventSendTimer = 0
13 Jul 2017 14:21:25,615 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StopTime = 0
(org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StartTime = 1499948454562 13 Jul 2017 14:21:25,608 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventPutAttemptCount = 1 13 Jul 2017 14:21:25,609 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventPutSuccessCount = 1 13 Jul 2017 14:21:25,609 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StopTime = 0 13 Jul 2017 14:21:25,610 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionCreatedCount = 0 13 Jul 2017 14:21:25,610 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchCompleteCount = 0 13 Jul 2017 14:21:25,610 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventDrainAttemptCount = 0 13 Jul 2017 14:21:25,611 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchEmptyCount = 5 13 Jul 2017 14:21:25,611 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StartTime = 1499948455470 13 Jul 2017 14:21:25,612 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchUnderflowCount = 0 13 Jul 2017 14:21:25,612 INFO [pool-4-thread-1] 
(org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionFailedCount = 0
13 Jul 2017 14:21:25,613 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionClosedCount = 0
13 Jul 2017 14:21:25,613 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - RollbackCount = 0
13 Jul 2017 14:21:25,614 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventDrainSuccessCount = 0
13 Jul 2017 14:21:25,614 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - KafkaEventSendTimer = 0
13 Jul 2017 14:21:25,615 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StopTime = 0
13 Jul 2017 14:20:54,562 INFO [lifecycleSupervisor-1-2] (org.apache.flume.instrumentation.MonitoredCounterGroup.start:95) - Component type: CHANNEL, name: memory_channel started 13 Jul 2017 14:20:55,064 INFO [conf-file-poller-0] (org.apache.flume.node.Application.startAllComponents:173) - Starting Sink sink 13 Jul 2017 14:20:55,065 INFO [conf-file-poller-0] (org.apache.flume.node.Application.startAllComponents:184) - Starting Source Netcat 13 Jul 2017 14:20:55,092 INFO [lifecycleSupervisor-1-1] (org.apache.flume.source.NetcatSource.start:150) - Source starting 13 Jul 2017 14:20:55,096 INFO [lifecycleSupervisor-1-1] (org.apache.flume.source.NetcatSource.start:164) - Created serverSocket:sun.nio.ch.ServerSocketChannelImpl[/127.0.0.1:56566] 13 Jul 2017 14:20:55,260 INFO [conf-file-poller-0] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink.configure:84) - Context parameters { parameters:{node=sandbox.hortonworks.com:, port=34545, type=org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink} } 13 Jul 2017 14:20:55,261 INFO [lifecycleSupervisor-1-0] (kafka.utils.Logging$class.info:70) - Verifying properties 13 Jul 2017 14:20:55,341 INFO [lifecycleSupervisor-1-0] (kafka.utils.Logging$class.info:70) - Property key.serializer.class is overridden to kafka.serializer.StringEncoder 13 Jul 2017 14:20:55,342 INFO [lifecycleSupervisor-1-0] (kafka.utils.Logging$class.info:70) - Property metadata.broker.list is overridden to sandbox.hortonworks.com:6668,sandbox.hortonworks.com:6668 13 Jul 2017 14:20:55,342 INFO [lifecycleSupervisor-1-0] (kafka.utils.Logging$class.info:70) - Property request.required.acks is overridden to 1 13 Jul 2017 14:20:55,342 INFO [lifecycleSupervisor-1-0] (kafka.utils.Logging$class.info:70) - Property serializer.class is overridden to kafka.serializer.DefaultEncoder 13 Jul 2017 14:20:55,469 INFO [lifecycleSupervisor-1-0] (org.apache.flume.instrumentation.MonitoredCounterGroup.register:119) - Monitored counter group for type: SINK, name: 
sink: Successfully registered new MBean. 13 Jul 2017 14:20:55,470 INFO [lifecycleSupervisor-1-0] (org.apache.flume.instrumentation.MonitoredCounterGroup.start:95) - Component type: SINK, name: sink started 13 Jul 2017 14:20:55,575 INFO [conf-file-poller-0] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink.start:67) - Starting Flume Metrics Sink 13 Jul 2017 14:20:55,585 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelCapacity = 1000 13 Jul 2017 14:20:55,587 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelFillPercentage = 0.0 13 Jul 2017 14:20:55,587 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventTakeSuccessCount = 0 13 Jul 2017 14:20:55,587 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelSize = 0 13 Jul 2017 14:20:55,587 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventTakeAttemptCount = 1 13 Jul 2017 14:20:55,587 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StartTime = 1499948454562 13 Jul 2017 14:20:55,587 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventPutAttemptCount = 0 13 Jul 2017 14:20:55,587 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventPutSuccessCount = 0 13 Jul 2017 14:20:55,587 INFO [pool-4-thread-1] 
(org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StopTime = 0 13 Jul 2017 14:20:55,587 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionCreatedCount = 0 13 Jul 2017 14:20:55,588 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchCompleteCount = 0 13 Jul 2017 14:20:55,588 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventDrainAttemptCount = 0 13 Jul 2017 14:20:55,588 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchEmptyCount = 0 13 Jul 2017 14:20:55,588 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StartTime = 1499948455470 13 Jul 2017 14:20:55,588 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchUnderflowCount = 0 13 Jul 2017 14:20:55,588 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionFailedCount = 0 13 Jul 2017 14:20:55,588 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionClosedCount = 0 13 Jul 2017 14:20:55,588 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - RollbackCount = 0 13 Jul 2017 14:20:55,588 INFO [pool-4-thread-1] 
(org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventDrainSuccessCount = 0 13 Jul 2017 14:20:55,588 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - KafkaEventSendTimer = 0 13 Jul 2017 14:20:55,588 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StopTime = 0 13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelCapacity = 1000 13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelFillPercentage = 0.0 13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventTakeSuccessCount = 0 13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelSize = 0 13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventTakeAttemptCount = 3 13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StartTime = 1499948454562 13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventPutAttemptCount = 0 13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] 
(org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventPutSuccessCount = 0 13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StopTime = 0 13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionCreatedCount = 0 13 Jul 2017 14:21:05,596 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchCompleteCount = 0 13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventDrainAttemptCount = 0 13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchEmptyCount = 2 13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StartTime = 1499948455470 13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchUnderflowCount = 0 13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionFailedCount = 0 13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionClosedCount = 0 13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] 
(org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - RollbackCount = 0 13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventDrainSuccessCount = 0 13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - KafkaEventSendTimer = 0 13 Jul 2017 14:21:05,597 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StopTime = 0 13 Jul 2017 14:21:15,599 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelCapacity = 1000 13 Jul 2017 14:21:15,599 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelFillPercentage = 0.0 13 Jul 2017 14:21:15,599 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventTakeSuccessCount = 0 13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelSize = 0 13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventTakeAttemptCount = 4 13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StartTime = 1499948454562 13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] 
(org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventPutAttemptCount = 0 13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventPutSuccessCount = 0 13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StopTime = 0 13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionCreatedCount = 0 13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchCompleteCount = 0 13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventDrainAttemptCount = 0 13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchEmptyCount = 4 13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StartTime = 1499948455470 13 Jul 2017 14:21:15,600 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchUnderflowCount = 0 13 Jul 2017 14:21:15,601 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionFailedCount = 0 13 Jul 2017 14:21:15,601 INFO [pool-4-thread-1] 
(org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionClosedCount = 0 13 Jul 2017 14:21:15,601 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - RollbackCount = 0 13 Jul 2017 14:21:15,601 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventDrainSuccessCount = 0 13 Jul 2017 14:21:15,601 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - KafkaEventSendTimer = 0 13 Jul 2017 14:21:15,601 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StopTime = 0 13 Jul 2017 14:21:25,606 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelCapacity = 1000 13 Jul 2017 14:21:25,606 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelFillPercentage = 0.1 13 Jul 2017 14:21:25,607 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventTakeSuccessCount = 0 13 Jul 2017 14:21:25,607 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ChannelSize = 1 13 Jul 2017 14:21:25,608 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventTakeAttemptCount = 7 13 Jul 2017 14:21:25,608 INFO [pool-4-thread-1] 
(org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StartTime = 1499948454562 13 Jul 2017 14:21:25,608 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventPutAttemptCount = 1 13 Jul 2017 14:21:25,609 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventPutSuccessCount = 1 13 Jul 2017 14:21:25,609 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StopTime = 0 13 Jul 2017 14:21:25,610 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionCreatedCount = 0 13 Jul 2017 14:21:25,610 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchCompleteCount = 0 13 Jul 2017 14:21:25,610 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventDrainAttemptCount = 0 13 Jul 2017 14:21:25,611 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchEmptyCount = 5 13 Jul 2017 14:21:25,611 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StartTime = 1499948455470 13 Jul 2017 14:21:25,612 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - BatchUnderflowCount = 0 13 Jul 2017 14:21:25,612 INFO [pool-4-thread-1] 
(org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionFailedCount = 0 13 Jul 2017 14:21:25,613 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - ConnectionClosedCount = 0 13 Jul 2017 14:21:25,613 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - RollbackCount = 0 13 Jul 2017 14:21:25,614 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - EventDrainSuccessCount = 0 13 Jul 2017 14:21:25,614 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - KafkaEventSendTimer = 0 13 Jul 2017 14:21:25,615 INFO [pool-4-thread-1] (org.apache.hadoop.metrics2.sink.flume.FlumeTimelineMetricsSink$TimelineMetricsCollector.processComponentAttributes:203) - StopTime = 0 13 Jul 2017 14:21:28,585 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.info:70) - Fetching metadata from broker BrokerEndPoint(0,sandbox.hortonworks.com,6668) with correlation id 0 for 1 topic(s) Set(flume_topic) 13 Jul 2017 14:21:28,587 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.info:70) - Connected to sandbox.hortonworks.com:6668 for producing 13 Jul 2017 14:21:28,620 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.info:70) - Disconnecting from sandbox.hortonworks.com:6668 13 Jul 2017 14:21:28,622 WARN [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.warn:91) - Fetching topic metadata with correlation id 0 for topics [Set(flume_topic)] from broker [BrokerEndPoint(0,sandbox.hortonworks.com,6668)] failed java.io.EOFException at 
org.apache.kafka.common.network.NetworkReceive.readFromReadableChannel(NetworkReceive.java:99)
	at kafka.network.BlockingChannel.readCompletely(BlockingChannel.scala:140)
	at kafka.network.BlockingChannel.receive(BlockingChannel.scala:131)
	at kafka.producer.SyncProducer.liftedTree1$1(SyncProducer.scala:84)
	at kafka.producer.SyncProducer.kafka$producer$SyncProducer$$doSend(SyncProducer.scala:81)
	at kafka.producer.SyncProducer.send(SyncProducer.scala:126)
	at kafka.client.ClientUtils$.fetchTopicMetadata(ClientUtils.scala:59)
	at kafka.producer.BrokerPartitionInfo.updateInfo(BrokerPartitionInfo.scala:83)
	at kafka.producer.async.DefaultEventHandler$$anonfun$handle$1.apply$mcV$sp(DefaultEventHandler.scala:73)
	at kafka.utils.CoreUtils$.swallow(CoreUtils.scala:78)
	at kafka.utils.Logging$class.swallowError(Logging.scala:108)
	at kafka.utils.CoreUtils$.swallowError(CoreUtils.scala:49)
	at kafka.producer.async.DefaultEventHandler.handle(DefaultEventHandler.scala:73)
	at kafka.producer.Producer.send(Producer.scala:93)
	at kafka.javaapi.producer.Producer.send(Producer.scala:44)
	at org.apache.flume.sink.kafka.KafkaSink.process(KafkaSink.java:135)
	at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:68)
	at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:147)
	at java.lang.Thread.run(Thread.java:748)
13 Jul 2017 14:21:28,624 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.info:70) - Disconnecting from sandbox.hortonworks.com:6668
13 Jul 2017 14:21:28,624 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.info:70) - Fetching metadata from broker BrokerEndPoint(1,sandbox.hortonworks.com,6668) with correlation id 0 for 1 topic(s) Set(flume_topic)
13 Jul 2017 14:21:28,625 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.info:70) - Connected to sandbox.hortonworks.com:6668 for producing
13 Jul 2017 14:21:28,662 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.info:70) - Disconnecting from sandbox.hortonworks.com:6668
13 Jul 2017 14:21:28,663 WARN [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.warn:91) - Fetching topic metadata with correlation id 0 for topics [Set(flume_topic)] from broker [BrokerEndPoint(1,sandbox.hortonworks.com,6668)] failed
java.io.EOFException
	at org.apache.kafka.common.network.NetworkReceive.readFromReadableChannel(NetworkReceive.java:99)
	at kafka.network.BlockingChannel.readCompletely(BlockingChannel.scala:140)
	at kafka.network.BlockingChannel.receive(BlockingChannel.scala:131)
	at kafka.producer.SyncProducer.liftedTree1$1(SyncProducer.scala:84)
	at kafka.producer.SyncProducer.kafka$producer$SyncProducer$$doSend(SyncProducer.scala:81)
	at kafka.producer.SyncProducer.send(SyncProducer.scala:126)
	at kafka.client.ClientUtils$.fetchTopicMetadata(ClientUtils.scala:59)
	at kafka.producer.BrokerPartitionInfo.updateInfo(BrokerPartitionInfo.scala:83)
	at kafka.producer.async.DefaultEventHandler$$anonfun$handle$1.apply$mcV$sp(DefaultEventHandler.scala:73)
	at kafka.utils.CoreUtils$.swallow(CoreUtils.scala:78)
	at kafka.utils.Logging$class.swallowError(Logging.scala:108)
	at kafka.utils.CoreUtils$.swallowError(CoreUtils.scala:49)
	at kafka.producer.async.DefaultEventHandler.handle(DefaultEventHandler.scala:73)
	at kafka.producer.Producer.send(Producer.scala:93)
	at kafka.javaapi.producer.Producer.send(Producer.scala:44)
	at org.apache.flume.sink.kafka.KafkaSink.process(KafkaSink.java:135)
	at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:68)
	at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:147)
	at java.lang.Thread.run(Thread.java:748)
13 Jul 2017 14:21:28,663 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.info:70) - Disconnecting from sandbox.hortonworks.com:6668
13 Jul 2017 14:21:28,664 ERROR [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$$anonfun$swallowError$1.apply:108) - fetching topic metadata for topics [Set(flume_topic)] from broker [ArrayBuffer(BrokerEndPoint(0,sandbox.hortonworks.com,6668), BrokerEndPoint(1,sandbox.hortonworks.com,6668))] failed
kafka.common.KafkaException: fetching topic metadata for topics [Set(flume_topic)] from broker [ArrayBuffer(BrokerEndPoint(0,sandbox.hortonworks.com,6668), BrokerEndPoint(1,sandbox.hortonworks.com,6668))] failed
	at kafka.client.ClientUtils$.fetchTopicMetadata(ClientUtils.scala:73)
	at kafka.producer.BrokerPartitionInfo.updateInfo(BrokerPartitionInfo.scala:83)
	at kafka.producer.async.DefaultEventHandler$$anonfun$handle$1.apply$mcV$sp(DefaultEventHandler.scala:73)
	at kafka.utils.CoreUtils$.swallow(CoreUtils.scala:78)
	at kafka.utils.Logging$class.swallowError(Logging.scala:108)
	at kafka.utils.CoreUtils$.swallowError(CoreUtils.scala:49)
	at kafka.producer.async.DefaultEventHandler.handle(DefaultEventHandler.scala:73)
	at kafka.producer.Producer.send(Producer.scala:93)
	at kafka.javaapi.producer.Producer.send(Producer.scala:44)
	at org.apache.flume.sink.kafka.KafkaSink.process(KafkaSink.java:135)
	at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:68)
	at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:147)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.EOFException
	at org.apache.kafka.common.network.NetworkReceive.readFromReadableChannel(NetworkReceive.java:99)
	at kafka.network.BlockingChannel.readCompletely(BlockingChannel.scala:140)
	at kafka.network.BlockingChannel.receive(BlockingChannel.scala:131)
	at kafka.producer.SyncProducer.liftedTree1$1(SyncProducer.scala:84)
	at kafka.producer.SyncProducer.kafka$producer$SyncProducer$$doSend(SyncProducer.scala:81)
	at kafka.producer.SyncProducer.send(SyncProducer.scala:126)
	at kafka.client.ClientUtils$.fetchTopicMetadata(ClientUtils.scala:59)
	... 12 more
13 Jul 2017 14:21:28,665 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.info:70) - Fetching metadata from broker BrokerEndPoint(0,sandbox.hortonworks.com,6668) with correlation id 1 for 1 topic(s) Set(flume_topic)
13 Jul 2017 14:21:28,666 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.info:70) - Connected to sandbox.hortonworks.com:6668 for producing
13 Jul 2017 14:21:28,710 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.info:70) - Disconnecting from sandbox.hortonworks.com:6668
13 Jul 2017 14:21:28,711 WARN [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.warn:91) - Fetching topic metadata with correlation id 1 for topics [Set(flume_topic)] from broker [BrokerEndPoint(0,sandbox.hortonworks.com,6668)] failed
java.io.EOFException
	at org.apache.kafka.common.network.NetworkReceive.readFromReadableChannel(NetworkReceive.java:99)
	at kafka.network.BlockingChannel.readCompletely(BlockingChannel.scala:140)
	at kafka.network.BlockingChannel.receive(BlockingChannel.scala:131)
	at kafka.producer.SyncProducer.liftedTree1$1(SyncProducer.scala:84)
	at kafka.producer.SyncProducer.kafka$producer$SyncProducer$$doSend(SyncProducer.scala:81)
	at kafka.producer.SyncProducer.send(SyncProducer.scala:126)
	at kafka.client.ClientUtils$.fetchTopicMetadata(ClientUtils.scala:59)
	at kafka.producer.BrokerPartitionInfo.updateInfo(BrokerPartitionInfo.scala:83)
	at kafka.producer.BrokerPartitionInfo.getBrokerPartitionInfo(BrokerPartitionInfo.scala:50)
	at kafka.producer.async.DefaultEventHandler.kafka$producer$async$DefaultEventHandler$$getPartitionListForTopic(DefaultEventHandler.scala:206)
	at kafka.producer.async.DefaultEventHandler$$anonfun$partitionAndCollate$1.apply(DefaultEventHandler.scala:170)
	at kafka.producer.async.DefaultEventHandler$$anonfun$partitionAndCollate$1.apply(DefaultEventHandler.scala:169)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
	at kafka.producer.async.DefaultEventHandler.partitionAndCollate(DefaultEventHandler.scala:169)
	at kafka.producer.async.DefaultEventHandler.dispatchSerializedData(DefaultEventHandler.scala:101)
	at kafka.producer.async.DefaultEventHandler.handle(DefaultEventHandler.scala:78)
	at kafka.producer.Producer.send(Producer.scala:93)
	at kafka.javaapi.producer.Producer.send(Producer.scala:44)
	at org.apache.flume.sink.kafka.KafkaSink.process(KafkaSink.java:135)
	at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:68)
	at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:147)
	at java.lang.Thread.run(Thread.java:748)
13 Jul 2017 14:21:28,712 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.info:70) - Disconnecting from sandbox.hortonworks.com:6668
13 Jul 2017 14:21:28,712 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.info:70) - Fetching metadata from broker BrokerEndPoint(1,sandbox.hortonworks.com,6668) with correlation id 1 for 1 topic(s) Set(flume_topic)
13 Jul 2017 14:21:28,713 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.info:70) - Connected to sandbox.hortonworks.com:6668 for producing
13 Jul 2017 14:21:28,801 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.info:70) - Disconnecting from sandbox.hortonworks.com:6668
13 Jul 2017 14:21:28,801 WARN [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.warn:91) - Fetching topic metadata with correlation id 1 for topics [Set(flume_topic)] from broker [BrokerEndPoint(1,sandbox.hortonworks.com,6668)] failed
java.io.EOFException
	at org.apache.kafka.common.network.NetworkReceive.readFromReadableChannel(NetworkReceive.java:99)
	at kafka.network.BlockingChannel.readCompletely(BlockingChannel.scala:140)
	at kafka.network.BlockingChannel.receive(BlockingChannel.scala:131)
	at kafka.producer.SyncProducer.liftedTree1$1(SyncProducer.scala:84)
	at kafka.producer.SyncProducer.kafka$producer$SyncProducer$$doSend(SyncProducer.scala:81)
	at kafka.producer.SyncProducer.send(SyncProducer.scala:126)
	at kafka.client.ClientUtils$.fetchTopicMetadata(ClientUtils.scala:59)
	at kafka.producer.BrokerPartitionInfo.updateInfo(BrokerPartitionInfo.scala:83)
	at kafka.producer.BrokerPartitionInfo.getBrokerPartitionInfo(BrokerPartitionInfo.scala:50)
	at kafka.producer.async.DefaultEventHandler.kafka$producer$async$DefaultEventHandler$$getPartitionListForTopic(DefaultEventHandler.scala:206)
	at kafka.producer.async.DefaultEventHandler$$anonfun$partitionAndCollate$1.apply(DefaultEventHandler.scala:170)
	at kafka.producer.async.DefaultEventHandler$$anonfun$partitionAndCollate$1.apply(DefaultEventHandler.scala:169)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
	at kafka.producer.async.DefaultEventHandler.partitionAndCollate(DefaultEventHandler.scala:169)
	at kafka.producer.async.DefaultEventHandler.dispatchSerializedData(DefaultEventHandler.scala:101)
	at kafka.producer.async.DefaultEventHandler.handle(DefaultEventHandler.scala:78)
	at kafka.producer.Producer.send(Producer.scala:93)
	at kafka.javaapi.producer.Producer.send(Producer.scala:44)
	at org.apache.flume.sink.kafka.KafkaSink.process(KafkaSink.java:135)
	at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:68)
	at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:147)
	at java.lang.Thread.run(Thread.java:748)
13 Jul 2017 14:21:28,802 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.info:70) - Disconnecting from sandbox.hortonworks.com:6668
13 Jul 2017 14:21:28,802 ERROR [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.error:99) - Failed to collate messages by topic, partition due to: fetching topic metadata for topics [Set(flume_topic)] from broker [ArrayBuffer(BrokerEndPoint(0,sandbox.hortonworks.com,6668), BrokerEndPoint(1,sandbox.hortonworks.com,6668))] failed
13 Jul 2017 14:21:28,803 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.info:70) - Back off for 100 ms before retrying send. Remaining retries = 3
13 Jul 2017 14:21:28,905 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.info:70) - Fetching metadata from broker BrokerEndPoint(0,sandbox.hortonworks.com,6668) with correlation id 2 for 1 topic(s) Set(flume_topic)
13 Jul 2017 14:21:28,905 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.info:70) - Connected to sandbox.hortonworks.com:6668 for producing
13 Jul 2017 14:21:28,942 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.info:70) - Disconnecting from sandbox.hortonworks.com:6668
13 Jul 2017 14:21:28,943 WARN [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.warn:91) - Fetching topic metadata with correlation id 2 for topics [Set(flume_topic)] from broker [BrokerEndPoint(0,sandbox.hortonworks.com,6668)] failed
java.io.EOFException
	at org.apache.kafka.common.network.NetworkReceive.readFromReadableChannel(NetworkReceive.java:99)
	at kafka.network.BlockingChannel.readCompletely(BlockingChannel.scala:140)
	at kafka.network.BlockingChannel.receive(BlockingChannel.scala:131)
	at kafka.producer.SyncProducer.liftedTree1$1(SyncProducer.scala:84)
	at kafka.producer.SyncProducer.kafka$producer$SyncProducer$$doSend(SyncProducer.scala:81)
	at kafka.producer.SyncProducer.send(SyncProducer.scala:126)
	at kafka.client.ClientUtils$.fetchTopicMetadata(ClientUtils.scala:59)
	at kafka.producer.BrokerPartitionInfo.updateInfo(BrokerPartitionInfo.scala:83)
	at kafka.producer.async.DefaultEventHandler$$anonfun$handle$2.apply$mcV$sp(DefaultEventHandler.scala:84) at
kafka.utils.CoreUtils$.swallow(CoreUtils.scala:78) at kafka.utils.Logging$class.swallowError(Logging.scala:108) at kafka.utils.CoreUtils$.swallowError(CoreUtils.scala:49) at kafka.producer.async.DefaultEventHandler.handle(DefaultEventHandler.scala:84) at kafka.producer.Producer.send(Producer.scala:93) at kafka.javaapi.producer.Producer.send(Producer.scala:44) at org.apache.flume.sink.kafka.KafkaSink.process(KafkaSink.java:135) at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:68) at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:147) at java.lang.Thread.run(Thread.java:748) 13 Jul 2017 14:21:28,943 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.info:70) - Disconnecting from sandbox.hortonworks.com:6668 13 Jul 2017 14:21:28,943 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.info:70) - Fetching metadata from broker BrokerEndPoint(1,sandbox.hortonworks.com,6668) with correlation id 2 for 1 topic(s) Set(flume_topic) 13 Jul 2017 14:21:28,943 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.info:70) - Connected to sandbox.hortonworks.com:6668 for producing 13 Jul 2017 14:21:28,976 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.info:70) - Disconnecting from sandbox.hortonworks.com:6668 13 Jul 2017 14:21:28,977 WARN [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.warn:91) - Fetching topic metadata with correlation id 2 for topics [Set(flume_topic)] from broker [BrokerEndPoint(1,sandbox.hortonworks.com,6668)] failed java.io.EOFException at org.apache.kafka.common.network.NetworkReceive.readFromReadableChannel(NetworkReceive.java:99) at kafka.network.BlockingChannel.readCompletely(BlockingChannel.scala:140) at kafka.network.BlockingChannel.receive(BlockingChannel.scala:131) at kafka.producer.SyncProducer.liftedTree1$1(SyncProducer.scala:84) at 
kafka.producer.SyncProducer.kafka$producer$SyncProducer$$doSend(SyncProducer.scala:81) at kafka.producer.SyncProducer.send(SyncProducer.scala:126) at kafka.client.ClientUtils$.fetchTopicMetadata(ClientUtils.scala:59) at kafka.producer.BrokerPartitionInfo.updateInfo(BrokerPartitionInfo.scala:83) at kafka.producer.async.DefaultEventHandler$$anonfun$handle$2.apply$mcV$sp(DefaultEventHandler.scala:84) at kafka.utils.CoreUtils$.swallow(CoreUtils.scala:78) at kafka.utils.Logging$class.swallowError(Logging.scala:108) at kafka.utils.CoreUtils$.swallowError(CoreUtils.scala:49) at kafka.producer.async.DefaultEventHandler.handle(DefaultEventHandler.scala:84) at kafka.producer.Producer.send(Producer.scala:93) at kafka.javaapi.producer.Producer.send(Producer.scala:44) at org.apache.flume.sink.kafka.KafkaSink.process(KafkaSink.java:135) at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:68) at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:147) at java.lang.Thread.run(Thread.java:748) 13 Jul 2017 14:21:28,977 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.info:70) - Disconnecting from sandbox.hortonworks.com:6668 13 Jul 2017 14:21:28,977 ERROR [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$$anonfun$swallowError$1.apply:108) - fetching topic metadata for topics [Set(flume_topic)] from broker [ArrayBuffer(BrokerEndPoint(0,sandbox.hortonworks.com,6668), BrokerEndPoint(1,sandbox.hortonworks.com,6668))] failed kafka.common.KafkaException: fetching topic metadata for topics [Set(flume_topic)] from broker [ArrayBuffer(BrokerEndPoint(0,sandbox.hortonworks.com,6668), BrokerEndPoint(1,sandbox.hortonworks.com,6668))] failed at kafka.client.ClientUtils$.fetchTopicMetadata(ClientUtils.scala:73) at kafka.producer.BrokerPartitionInfo.updateInfo(BrokerPartitionInfo.scala:83) at kafka.producer.async.DefaultEventHandler$$anonfun$handle$2.apply$mcV$sp(DefaultEventHandler.scala:84) at 
kafka.utils.CoreUtils$.swallow(CoreUtils.scala:78) at kafka.utils.Logging$class.swallowError(Logging.scala:108) at kafka.utils.CoreUtils$.swallowError(CoreUtils.scala:49) at kafka.producer.async.DefaultEventHandler.handle(DefaultEventHandler.scala:84) at kafka.producer.Producer.send(Producer.scala:93) at kafka.javaapi.producer.Producer.send(Producer.scala:44) at org.apache.flume.sink.kafka.KafkaSink.process(KafkaSink.java:135) at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:68) at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:147) at java.lang.Thread.run(Thread.java:748) Caused by: java.io.EOFException at org.apache.kafka.common.network.NetworkReceive.readFromReadableChannel(NetworkReceive.java:99) at kafka.network.BlockingChannel.readCompletely(BlockingChannel.scala:140) at kafka.network.BlockingChannel.receive(BlockingChannel.scala:131) at kafka.producer.SyncProducer.liftedTree1$1(SyncProducer.scala:84) at kafka.producer.SyncProducer.kafka$producer$SyncProducer$$doSend(SyncProducer.scala:81) at kafka.producer.SyncProducer.send(SyncProducer.scala:126) at kafka.client.ClientUtils$.fetchTopicMetadata(ClientUtils.scala:59) ... 12 more 13 Jul 2017 14:21:28,978 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.info:70) - Fetching metadata from broker BrokerEndPoint(1,sandbox.hortonworks.com,6668) with correlation id 3 for 1 topic(s) Set(flume_topic) 13 Jul 2017 14:21:28,978 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (kafka.utils.Logging$class.info:70) - Connected to sandbox.hortonworks.com:6668 for producing
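A note on what the trace shows: every metadata fetch goes through `kafka.producer.SyncProducer` and `kafka.network.BlockingChannel`, which is the legacy Scala producer API. That client speaks only plaintext; it cannot perform the SASL_SSL handshake, so a broker listener configured for SASL_SSL (here port 6668, per the `security.protocol=SASL_SSL` in the sink context above) closes the connection and the client sees `java.io.EOFException` on every attempt. The security properties in the sink context are effectively ignored by this producer. A sketch of how the same sink would be configured on a Flume version (1.7+) whose KafkaSink uses the new Java producer, which does support SASL_SSL; property values other than those visible in the log (agent `demo`, sink `sink`, channel `memory_channel`, topic `flume_topic`, truststore path) are placeholders:

```
# Flume 1.7+ KafkaSink sketch - kafka.producer.* properties are passed
# through to the new Java producer, which supports SASL_SSL.
demo.sinks.sink.type = org.apache.flume.sink.kafka.KafkaSink
demo.sinks.sink.channel = memory_channel
demo.sinks.sink.kafka.topic = flume_topic
demo.sinks.sink.kafka.bootstrap.servers = sandbox.hortonworks.com:6668
demo.sinks.sink.kafka.producer.security.protocol = SASL_SSL
demo.sinks.sink.kafka.producer.sasl.mechanism = GSSAPI
demo.sinks.sink.kafka.producer.ssl.truststore.location = /etc/security/certificate/server.truststore.jks
# Placeholder - the real truststore password is not in the log:
demo.sinks.sink.kafka.producer.ssl.truststore.password = changeit
```

The Kerberos side (JAAS file with a `KafkaClient` section pointing at the `flume/sandbox.hortonworks.com@HADOOP.COM` keytab) would also need to be supplied to the agent JVM via `-Djava.security.auth.login.config`.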