Created on 06-19-2015 05:35 AM - edited 09-16-2022 02:31 AM
Hi, I’m working with CDH 5.4.2 and we use Flume for data aggregation. When we ran the Flume agent, we observed the following problem:
18:06:2015 11:26:17 ERROR [ChannelProcessor] Error while writing to required channel: org.apache.flume.channel.MemoryChannel{name: channel1} java.lang.OutOfMemoryError: GC overhead limit exceeded
agent1.channels.channel1.type = memory
agent1.channels.channel1.capacity = 100000
agent1.channels.channel1.transactionCapactiy = 1000
agent1.sources.source1.channels = channel1
agent1.sinks = sink1
agent1.sinks.sink1.type = hdfs
agent1.sinks.sink1.channel = channel1
agent1.sinks.sink1.hdfs.useLocalTimeStamp = true
agent1.sinks.sink1.hdfs.path = hdfs://xxx/%Y%m%d%H
agent1.sinks.sink1.hdfs.filePrefix = xdr
agent1.sinks.sink1.hdfs.fileType = DataStream
agent1.sinks.sink1.hdfs.writeFormat = Text
agent1.sinks.sink1.hdfs.rollCount = 2000000
agent1.sinks.sink1.hdfs.rollSize = 0
agent1.sinks.sink1.hdfs.batchSize = 100
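A side note that may or may not be related: the transaction capacity property above is spelled transactionCapactiy, and Flume will not apply a misspelled key, so the memory channel would fall back to its default transactionCapacity of 100. If the intent was really 1000, the corrected line would be:

agent1.channels.channel1.transactionCapacity = 1000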
I would really appreciate your help on this issue.
Regards,
Osman Uygar
Created 07-15-2015 03:51 AM
We resolved this problem by adding the line below to flume-env.sh:
export JAVA_OPTS="-Xms100m -Xmx2000m -Dcom.sun.management.jmxremote"
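For reference, a minimal flume-env.sh sketch along these lines; the 2000m value is only an example and should be sized so the heap can hold the memory channel's capacity (100000 events) times the average event size, plus normal JVM overhead:

# flume-env.sh -- sourced by the flume-ng launcher before the agent JVM starts
# -Xms/-Xmx: initial and maximum heap; size -Xmx for channel capacity x event size
# -Dcom.sun.management.jmxremote: enables JMX so heap usage can be monitored
export JAVA_OPTS="-Xms100m -Xmx2000m -Dcom.sun.management.jmxremote"

The agent has to be restarted for the change to take effect. On a CDH cluster managed by Cloudera Manager, the same effect is normally achieved through the Flume agent role's Java heap size setting rather than by editing flume-env.sh by hand.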
Thanks for your response
Created 05-22-2017 12:21 PM
Hi,
I am also facing the same issue, but in my case your suggestion did not work. Could you please help? I am still seeing the same error:
ERROR hdfs.HDFSEventSink: process failed
java.lang.OutOfMemoryError: GC overhead limit exceeded