[Flume] - GC overhead limit exceeded error in flume channel

Explorer

Hi, I'm working with CDH 5.4.2, and we use Flume for data aggregation. When we ran the Flume agent, we observed the following error:

 

18:06:2015 11:26:17 ERROR [ChannelProcessor] Error while writing to required channel: org.apache.flume.channel.MemoryChannel{name: channel1}
java.lang.OutOfMemoryError: GC overhead limit exceeded
 
Configuration File details:
agent1.channels.channel1.type = memory
agent1.channels.channel1.capacity = 100000
agent1.channels.channel1.transactionCapacity = 1000

agent1.sources.source1.channels = channel1 

agent1.sinks = sink1

agent1.sinks.sink1.type = hdfs
agent1.sinks.sink1.channel = channel1
agent1.sinks.sink1.hdfs.useLocalTimeStamp = true
agent1.sinks.sink1.hdfs.path = hdfs://xxx/%Y%m%d%H
agent1.sinks.sink1.hdfs.filePrefix = xdr
agent1.sinks.sink1.hdfs.fileType = DataStream
agent1.sinks.sink1.hdfs.writeFormat = Text
agent1.sinks.sink1.hdfs.rollCount = 2000000
agent1.sinks.sink1.hdfs.rollSize = 0
agent1.sinks.sink1.hdfs.batchSize = 100

 

I'd really appreciate your help on this issue.

 

Regards,

Osman Uygar

5 REPLIES

Mentor
You are using the memory channel, which sits inside the Java heap of the Flume process. With a configured capacity of 100,000 events, a full channel (depending on the source's event sizes) can exhaust the heap if there isn't sufficient headroom.

What is your current Flume heap size? You may want to double it, or better, estimate it as the average source event size times the channel capacity, plus some breathing room for the Flume daemon's own work.
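As a rough sizing sketch — assuming an average event of about 4 KB, which is an illustrative figure rather than anything measured in this thread — the heap could be raised in conf/flume-env.sh like this:

```shell
# Back-of-the-envelope estimate for a full memory channel:
#   100,000 events (channel capacity) x ~4 KB per event ~= 400 MB
# Add headroom for the agent's own work, so ~1 GB is a reasonable start.
# The flume-ng launcher picks JAVA_OPTS up from conf/flume-env.sh.
export JAVA_OPTS="-Xms512m -Xmx1024m"
```

Restart the agent after changing the file; the right values depend on your actual event sizes, so measure rather than guess if the error persists.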

Explorer

This problem has been solved!

Mentor

Glad to hear it! Please consider marking your post as a solution so that others with a similar issue can find the resolution faster.

New Contributor

Hi, I am facing the same issue, but in my case your suggestion did not work; I am still seeing the same error. Please help.

 

ERROR hdfs.HDFSEventSink: process failed
java.lang.OutOfMemoryError: GC overhead limit exceeded

New Contributor

Try setting the heap size when you start the Flume agent:

flume-ng agent -n a1 -f flume_config.conf -Xms256m -Xmx1024m