Getting OutOfMemoryError: GC overhead limit exceeded in production
Labels: Apache Hadoop, Apache Pig
Created ‎01-12-2017 01:02 PM
My Pig script runs fine with a low-volume dataset, but it throws an OutOfMemoryError when run in production against a large dataset.
Line 59661: 2017-01-09 13:04:41,610 FATAL [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.OutOfMemoryError: GC overhead limit exceeded
Line 73580: 2017-01-09 12:57:44,043 FATAL [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.OutOfMemoryError: GC overhead limit exceeded
Line 74087: Halting due to Out Of Memory Error...
Line 87609: 2017-01-09 12:57:58,235 INFO [communication thread] org.apache.hadoop.mapred.Task: Communication exception: java.lang.OutOfMemoryError: GC overhead limit exceeded
Line 87622: 2017-01-09 12:58:09,979 FATAL [IPC Client (378689909) connection to /166.37.225.35:40341 from job_1469684844014_1921354] org.apache.hadoop.yarn.YarnUncaughtExceptionHandler: Thread Thread[IPC Client (378689909) connection to /166.37.225.35:40341 from job_1469684844014_1921354,5,main] threw an Error. Shutting down now...
Created ‎01-12-2017 01:50 PM
Try increasing the memory: edit the hive-env template in the Advanced hive-env tab and add:

export HADOOP_OPTS="-XX:NewRatio=12 -Xmx4096m -XX:MaxHeapFreeRatio=40 -XX:MinHeapFreeRatio=15 -XX:+UseGCOverheadLimit -XX:+UseConcMarkSweepGC"

The -Xmx****m parameter sets the maximum heap size in MB.
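Note that the FATAL lines above come from org.apache.hadoop.mapred.YarnChild, i.e. the map/reduce task JVMs, so the task-level memory settings may matter more than the client's HADOOP_OPTS. A minimal sketch of raising task memory from inside the Pig script itself, assuming the standard MapReduce properties (the 4096/3276 values here are illustrative, not tuned recommendations):

```pig
-- Sketch: per-job memory overrides set at the top of the Pig script.
-- mapreduce.*.memory.mb is the YARN container size in MB;
-- mapreduce.*.java.opts sets the task JVM heap, which should stay
-- below the container size to leave room for non-heap memory.
SET mapreduce.map.memory.mb 4096;
SET mapreduce.map.java.opts '-Xmx3276m';
SET mapreduce.reduce.memory.mb 4096;
SET mapreduce.reduce.java.opts '-Xmx3276m';
```

The same properties can also be passed on the command line with -D when launching the job, if you prefer not to edit the script.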
