Explorer
Posts: 13
Registered: 04-05-2017

Re: Map and Reduce Error: Java heap space

Thanks for replying, the problem was solved. I would like to ask another question because, after researching, I didn't find a solution. After installing Cloudera Manager I got a problem with HDFS: "Problèmes d'état d'intégrité HDFS : Blocs sous-répliqués" (in English: "HDFS health issue: under-replicated blocks"). Do you have an idea about the solution?
Posts: 394
Topics: 11
Kudos: 60
Solutions: 35
Registered: 09-02-2016

Re: Map and Reduce Error: Java heap space

@onsbt

 

Can you translate your issue into English? Also, if it is not related to Java heap space, I would recommend creating a new thread instead, so that it is easier to track and others can contribute as well.

Explorer
Posts: 13
Registered: 04-05-2017

Re: Map and Reduce Error: Java heap space

Thanks for replying.

I updated the file directly instead of via Cloudera Manager, and that resolved my problem :) Thank you so much. I have another question: I am running Cloudera with the default configuration on a one-node cluster, and I would like to find where HDFS stores files locally. I created a file in HDFS with Hue, but when I look at /dfs/nn it's empty, and I can't find the file that I created.

Posts: 394
Topics: 11
Kudos: 60
Solutions: 35
Registered: 09-02-2016

Re: Map and Reduce Error: Java heap space

@onsbt

 

The default path is /opt/hadoop/dfs/nn

 

You can confirm this via Cloudera Manager -> HDFS -> Configuration -> search for "dfs.namenode.name.dir"
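One thing worth noting (this is general HDFS behavior, not something stated earlier in the thread): dfs.namenode.name.dir holds only NameNode metadata (fsimage and edit logs), never the file contents themselves. The actual block data lives under the DataNode directory, dfs.datanode.data.dir. In a non-CM installation these would appear in hdfs-site.xml roughly like this (the paths below are illustrative, not necessarily your values):

```xml
<!-- hdfs-site.xml (illustrative paths, adjust to your layout) -->
<property>
  <name>dfs.namenode.name.dir</name>
  <value>/dfs/nn</value>   <!-- NameNode metadata only (fsimage, edits) -->
</property>
<property>
  <name>dfs.datanode.data.dir</name>
  <value>/dfs/dn</value>   <!-- actual HDFS block files (blk_* files) -->
</property>
```

Even in the DataNode directory, files are stored as blk_* block files rather than under their HDFS names, so you won't see the filename you created in Hue.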

Explorer
Posts: 13
Registered: 04-05-2017

Re: Map and Reduce Error: Java heap space

The path /opt/hadoop/dfs/nn does not exist, and when I look for the file that I created, I can't find it at that path.

 

 

Posts: 394
Topics: 11
Kudos: 60
Solutions: 35
Registered: 09-02-2016

Re: Map and Reduce Error: Java heap space

@onsbt

 

As mentioned already, please create a new topic for the new issue, as continuing it here may mislead others.

 

Also, please read the full answer before replying, so that you get the desired answer.

Contributor
Posts: 33
Registered: 05-09-2017

Re: Map and Reduce Error: Java heap space

@saranvisa

 

The last reducer of my MapReduce job fails with the error below.

 

2017-09-20 16:23:23,732 FATAL [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.OutOfMemoryError: GC overhead limit exceeded
	at java.util.regex.Matcher.<init>(Matcher.java:224)
	at java.util.regex.Pattern.matcher(Pattern.java:1088)
	at java.lang.String.replaceAll(String.java:2162)
	at com.sas.ci.acs.extract.CXAService$myReduce.parseEvent(CXAService.java:1612)
	at com.sas.ci.acs.extract.CXAService$myReduce.reduce(CXAService.java:919)
	at com.sas.ci.acs.extract.CXAService$myReduce.reduce(CXAService.java:237)
	at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:444)
	at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

2017-09-20 16:23:23,834 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Stopping ReduceTask metrics system...
2017-09-20 16:23:23,834 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: ReduceTask metrics system stopped.
2017-09-20 16:23:23,834 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: ReduceTask metrics system shutdown complete.

 

Current settings: 

 

mapreduce.map.java.opts = -Djava.net.preferIPv4Stack=true -Xmx3865051136

 

mapreduce.reduce.java.opts = -Djava.net.preferIPv4Stack=true -Xmx6144067296
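(For readability, the -Xmx values above are raw byte counts; a quick conversion, not part of the original post, shows what they amount to in GiB:)

```python
# Convert the raw -Xmx byte values from the job configuration above to GiB.
map_xmx_bytes = 3865051136
reduce_xmx_bytes = 6144067296

GIB = 1024 ** 3  # bytes per GiB

print(f"map heap:    {map_xmx_bytes / GIB:.2f} GiB")     # ~3.60 GiB
print(f"reduce heap: {reduce_xmx_bytes / GIB:.2f} GiB")  # ~5.72 GiB
```

So the current map heap is roughly 3.6 GiB and the reduce heap roughly 5.7 GiB.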

 

1) Do you recommend increasing the following properties to the values below?

 

"mapreduce.map.java.opts","-Xmx4g" 
"mapreduce.reduce.java.opts","-Xmx8g" 

 

2) These are my current map and reduce memory settings. Do I also need to bump up my reduce memory to 10240m?

 

mapreduce.map.memory.mb = 8192
mapreduce.reduce.memory.mb = 8192

Posts: 394
Topics: 11
Kudos: 60
Solutions: 35
Registered: 09-02-2016

Re: Map and Reduce Error: Java heap space

@desind

 

I would not recommend changing your cluster settings; instead, you can pass the memory and Java opts when you execute your jar.

 

 

Ex: below are some sample values; change them as needed.

 

hadoop jar ${JAR_PATH} ${CONFIG_PATH}/filename.xml ${ENV} ${ODATE} mapMem=12288 mapJavaOpts=Xmx9830 redurMem=12288 redurJavaOpts=Xmx9830

 

Note:

mapJavaOpts = mapMem * 0.8

redurJavaOpts = redurMem * 0.8
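The 0.8 rule in the note above can be sketched as a small helper (the 12288 value is the sample from the hadoop jar invocation above; the function name is mine, not from the post):

```python
def heap_from_container(container_mb, ratio=0.8):
    """Derive a JVM -Xmx value (in MB) from the YARN container size,
    leaving roughly 20% headroom for non-heap JVM overhead."""
    return int(container_mb * ratio)

# Sample value from the command line above: 12288 MB container
print(heap_from_container(12288))  # 9830
```

This matches the mapMem=12288 / mapJavaOpts=Xmx9830 pairing in the example command: 12288 * 0.8 = 9830.4, truncated to 9830.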

 

Contributor
Posts: 33
Registered: 05-09-2017

Re: Map and Reduce Error: Java heap space

@saranvisa

 

 

 

Anything else ?

Posts: 394
Topics: 11
Kudos: 60
Solutions: 35
Registered: 09-02-2016

Re: Map and Reduce Error: Java heap space

@desind

 

To add to your point, the cluster-wide setup applies to all MapReduce jobs, so changing it may impact other jobs as well.

 

In fact, I am not against setting higher values in the cluster itself, but you should do that based on how many jobs require the higher values, performance considerations, etc.

 

 
