Created 01-12-2017 01:02 PM
My Pig script runs fine on a small dataset, but it throws an OutOfMemoryError when running in production against a large dataset.
Line 59661: 2017-01-09 13:04:41,610 FATAL [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.OutOfMemoryError: GC overhead limit exceeded
Line 73580: 2017-01-09 12:57:44,043 FATAL [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.OutOfMemoryError: GC overhead limit exceeded
Line 74087: Halting due to Out Of Memory Error...
Line 87609: 2017-01-09 12:57:58,235 INFO [communication thread] org.apache.hadoop.mapred.Task: Communication exception: java.lang.OutOfMemoryError: GC overhead limit exceeded
Line 87622: 2017-01-09 12:58:09,979 FATAL [IPC Client (378689909) connection to /166.37.225.35:40341 from job_1469684844014_1921354] org.apache.hadoop.yarn.YarnUncaughtExceptionHandler: Thread Thread[IPC Client (378689909) connection to /166.37.225.35:40341 from job_1469684844014_1921354,5,main] threw an Error. Shutting down now...
Created 01-12-2017 01:50 PM
Try increasing your memory. Edit the hive-env template in the Advanced hive-env tab and add:
export HADOOP_OPTS="-XX:NewRatio=12 -Xmx4096m -XX:MaxHeapFreeRatio=40 -XX:MinHeapFreeRatio=15 -XX:+UseGCOverheadLimit -XX:+UseConcMarkSweepGC"
The -Xmx parameter sets the maximum heap size; its value is in MB (4096 MB here), so raise it to suit your workload.
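
Since the fatal errors come from YarnChild (the map tasks themselves), it can also help to raise the MapReduce task memory for the Pig job directly. A minimal sketch, assuming the job runs on YARN/MapReduce2; the property names are standard Hadoop settings, and the 4096/3686 values are illustrative assumptions to adjust to your cluster's container limits:

-- at the top of the Pig script (or pass via -D on the pig command line)
SET mapreduce.map.memory.mb 4096;            -- YARN container size for map tasks, in MB
SET mapreduce.map.java.opts '-Xmx3686m';     -- JVM heap for map tasks
SET mapreduce.reduce.memory.mb 4096;         -- YARN container size for reduce tasks, in MB
SET mapreduce.reduce.java.opts '-Xmx3686m';  -- JVM heap for reduce tasks

Keeping -Xmx at roughly 80% of the container size leaves headroom for off-heap usage, so YARN does not kill the container for exceeding its memory limit.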