New Contributor
Posts: 3
Registered: ‎01-23-2019

Re: Problem about Configuring Flume as Kafka Consumer

So, there were three issues.  First off, you were correct that the group was not attached to the partition.  I'm sure it's obvious, but I'm new to Kafka, so I'm not sure how groups get associated with a partition.  In my case the partition was already tied to a different group that I'm not familiar with; I'll need to do some research.


Second, my consumer offset was equal to the log-end offset (no lag), so I had to reset my offset to 0 to replay the existing data.
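For anyone hitting the same thing: a reset like that can be sketched with the `kafka-consumer-groups` tool (available in newer Kafka versions; the broker address, group, and topic names below are placeholders for your own). Stop the consumer first, dry-run, then execute:

```shell
# Placeholder names: replace broker1:9092, flume-group, and my-topic with yours.
# Dry run first: shows what the new offsets would be without committing anything.
kafka-consumer-groups --bootstrap-server broker1:9092 \
  --group flume-group --topic my-topic \
  --reset-offsets --to-earliest --dry-run

# Apply it for real (the consumer must be stopped while you do this):
kafka-consumer-groups --bootstrap-server broker1:9092 \
  --group flume-group --topic my-topic \
  --reset-offsets --to-earliest --execute
```

Note that `--reset-offsets` only exists in Kafka 0.11+; on older parcels you may have to reset offsets another way.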


Third, my Java heap was too small.
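If it helps anyone else, the agent heap can be raised in `flume-env.sh` (or the equivalent Java heap setting for the Flume agent in Cloudera Manager). The sizes below are illustrative only, not a recommendation:

```shell
# flume-env.sh: bump the Flume agent JVM heap (example values, tune for your load)
export JAVA_OPTS="-Xms512m -Xmx1024m"
```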


As soon as I fixed those and switched to the proper group, as shown by the kafka-consumer-groups command, data started flowing.  Now to figure out how the groups work.
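For reference, this is roughly how I checked the group-to-partition mapping (broker address and group name are placeholders for your own):

```shell
# List all consumer groups the brokers know about
kafka-consumer-groups --bootstrap-server broker1:9092 --list

# Show a group's partition assignments, current offset, log-end offset, and lag
kafka-consumer-groups --bootstrap-server broker1:9092 --describe --group flume-group
```

The `--describe` output is where you can see the current offset equal to the log-end offset (lag 0), and which consumer/group owns each partition.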


Thanks so much for your help.

Cloudera Employee
Posts: 262
Registered: ‎01-09-2014

Re: Problem about Configuring Flume as Kafka Consumer

Just realized, the log4j setting should go in the Flume logging safety valve, not the broker's. Also, make sure you can run kafka-console-consumer and connect to the topic as well, just to confirm the problem isn't with Kafka itself.
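Something like the following should confirm the topic is readable end to end (broker and topic names are placeholders; on older CDH Kafka versions the tool takes `--zookeeper` instead of `--bootstrap-server`):

```shell
# Read the topic from the beginning; if messages print here, Kafka itself is fine
# and the problem is on the Flume side.
kafka-console-consumer --bootstrap-server broker1:9092 \
  --topic my-topic --from-beginning
```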

-pd
New Contributor
Posts: 1
Registered: ‎02-17-2019

Re: Problem about Configuring Flume as Kafka Consumer


Got the same problem here.

By the way, the log4j setting (log4j.logger.org.apache.kafka=DEBUG) can't be applied in the QuickStart VM 5.13 because the file is read-only; it doesn't let me save the change. @pdvorak


Any other help would be highly appreciated! Thanks.

Cloudera Employee
Posts: 262
Registered: ‎01-09-2014

Re: Problem about Configuring Flume as Kafka Consumer

That's odd that the VM is read-only. Are you making the change in CM via the Flume logging safety valve, rather than editing the file directly?
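To be explicit, the snippet would go into Flume's logging advanced configuration snippet (safety valve) in Cloudera Manager (the exact field name may differ slightly by CM version), followed by a restart of the Flume agent, rather than editing log4j.properties on disk:

```properties
# Flume > Configuration > Logging Advanced Configuration Snippet (Safety Valve)
log4j.logger.org.apache.kafka=DEBUG
```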

-pd