<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Getting OutOfMemoryError: GC overhead limit exceeded in production in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Getting-OutOfMemoryError-GC-overhead-limit-exceeded-in/m-p/126536#M51459</link>
    <description>&lt;P&gt;My Pig script runs fine when I use a low volume of data, but it throws an OutOfMemoryError when I run it in production with a large dataset.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Line 59661: 2017-01-09 13:04:41,610 FATAL [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.OutOfMemoryError: GC overhead limit exceeded
	Line 73580: 2017-01-09 12:57:44,043 FATAL [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.OutOfMemoryError: GC overhead limit exceeded
	Line 74087: Halting due to Out Of Memory Error...
	Line 87609: 2017-01-09 12:57:58,235 INFO [communication thread] org.apache.hadoop.mapred.Task: Communication exception: java.lang.OutOfMemoryError: GC overhead limit exceeded
	Line 87622: 2017-01-09 12:58:09,979 FATAL [IPC Client (378689909) connection to /166.37.225.35:40341 from job_1469684844014_1921354] org.apache.hadoop.yarn.YarnUncaughtExceptionHandler: Thread Thread[IPC Client (378689909) connection to /166.37.225.35:40341 from job_1469684844014_1921354,5,main] threw an Error. Shutting down now...&lt;/P&gt;</description>
    <pubDate>Thu, 12 Jan 2017 21:02:57 GMT</pubDate>
    <dc:creator>das_dineshk</dc:creator>
    <dc:date>2017-01-12T21:02:57Z</dc:date>
    <item>
      <title>Getting OutOfMemoryError: GC overhead limit exceeded in production</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Getting-OutOfMemoryError-GC-overhead-limit-exceeded-in/m-p/126536#M51459</link>
      <description>&lt;P&gt;My Pig script runs fine when I use a low volume of data, but it throws an OutOfMemoryError when I run it in production with a large dataset.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Line 59661: 2017-01-09 13:04:41,610 FATAL [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.OutOfMemoryError: GC overhead limit exceeded
	Line 73580: 2017-01-09 12:57:44,043 FATAL [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.OutOfMemoryError: GC overhead limit exceeded
	Line 74087: Halting due to Out Of Memory Error...
	Line 87609: 2017-01-09 12:57:58,235 INFO [communication thread] org.apache.hadoop.mapred.Task: Communication exception: java.lang.OutOfMemoryError: GC overhead limit exceeded
	Line 87622: 2017-01-09 12:58:09,979 FATAL [IPC Client (378689909) connection to /166.37.225.35:40341 from job_1469684844014_1921354] org.apache.hadoop.yarn.YarnUncaughtExceptionHandler: Thread Thread[IPC Client (378689909) connection to /166.37.225.35:40341 from job_1469684844014_1921354,5,main] threw an Error. Shutting down now...&lt;/P&gt;</description>
      <pubDate>Thu, 12 Jan 2017 21:02:57 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Getting-OutOfMemoryError-GC-overhead-limit-exceeded-in/m-p/126536#M51459</guid>
      <dc:creator>das_dineshk</dc:creator>
      <dc:date>2017-01-12T21:02:57Z</dc:date>
    </item>
    <item>
      <title>Re: Getting OutOfMemoryError: GC overhead limit exceeded in production</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Getting-OutOfMemoryError-GC-overhead-limit-exceeded-in/m-p/126537#M51460</link>
      <description>&lt;P&gt;Try increasing the memory available to the JVM. Edit the hive-env template in the Advanced hive-env tab and add:&lt;/P&gt;&lt;PRE&gt;export HADOOP_OPTS="-XX:NewRatio=12 -Xmx4096m -XX:MaxHeapFreeRatio=40 -XX:MinHeapFreeRatio=15 -XX:+UseGCOverheadLimit -XX:+UseConcMarkSweepGC"&lt;/PRE&gt;&lt;P&gt;The -Xmx****m parameter sets the maximum heap size in MB.&lt;/P&gt;
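&lt;P&gt;Since the failing tasks are YARN map tasks launched by a Pig script, the same idea can also be applied per job by raising the task container and heap sizes. A minimal sketch, assuming MRv2 property names and the rule of thumb of giving roughly 80% of the container to the heap (the values below are placeholders to adjust for your cluster):&lt;/P&gt;&lt;PRE&gt;-- At the top of the Pig script: request 4 GB task containers with ~3.2 GB heaps.
SET mapreduce.map.memory.mb 4096;
SET mapreduce.map.java.opts '-Xmx3276m';
SET mapreduce.reduce.memory.mb 4096;
SET mapreduce.reduce.java.opts '-Xmx3276m';&lt;/PRE&gt;&lt;P&gt;Keeping -Xmx below the container size leaves headroom for non-heap memory, so YARN does not kill the container before the JVM can make use of the larger heap.&lt;/P&gt;</description>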
      <pubDate>Thu, 12 Jan 2017 21:50:39 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Getting-OutOfMemoryError-GC-overhead-limit-exceeded-in/m-p/126537#M51460</guid>
      <dc:creator>frank93</dc:creator>
      <dc:date>2017-01-12T21:50:39Z</dc:date>
    </item>
  </channel>
</rss>

