Solved
Spark yarn-client mode failed due to OutOfMemoryError exception
Labels: Apache Spark
Contributor
Created 02-17-2017 12:11 AM
Halting due to Out Of Memory Error ...
Exception : java.lang.OutOfMemoryError thrown from the UncaughtExceptionHandler in thread "dispatcher-event-loop-0"
Halting due to Out Of Memory Error ...
Exception : java.lang.OutOfMemoryError thrown from the UncaughtExceptionHandler in thread "IPC Client (724929395) connection to /172.22.103.121:39927 from job_1471597554341_0133"
Halting due to Out Of Memory Error ...
Exception : java.lang.OutOfMemoryError thrown from the UncaughtExceptionHandler in thread "main"
Halting due to Out Of Memory Error ...
Exception : java.lang.OutOfMemoryError thrown from the UncaughtExceptionHandler in thread "LeaseRenewer:hrt_qa@nat-r7-movs-oozie-1-5.openstacklocal:8020"
Halting due to Out Of Memory Error ...
Exception : java.lang.OutOfMemoryError thrown from the UncaughtExceptionHandler in thread "shuffle-server-0"
End of LogType:stderr
1 ACCEPTED SOLUTION
Contributor
Created 02-17-2017 12:32 AM
You can try setting --driver-memory to 2g in the spark-submit command and see if that helps.
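A minimal sketch of what that spark-submit invocation could look like; the application jar and class names are placeholders, not from the original thread:

```shell
# Sketch: raise driver heap to 2g in yarn-client mode.
# In client mode the driver runs in the spark-submit JVM, so
# --driver-memory (or spark.driver.memory set before launch) controls its heap.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --driver-memory 2g \
  --class com.example.MyApp \
  my-app.jar
```

Note that in client mode, setting spark.driver.memory via --conf inside the application has no effect, since the driver JVM has already started; pass it on the command line as shown.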
2 REPLIES
Rising Star
Created 02-17-2017 12:23 AM
You can probably increase the AM memory via "spark.yarn.am.memory" and see if that helps. The default is 512m; start with 2g or 4g.
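A hedged sketch of passing that property at submit time; the jar and class names are hypothetical:

```shell
# Sketch: raise the YARN ApplicationMaster heap in yarn-client mode.
# spark.yarn.am.memory applies only in client mode, where the AM is a
# separate process from the driver; in cluster mode use --driver-memory instead.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --conf spark.yarn.am.memory=2g \
  --class com.example.MyApp \
  my-app.jar
```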
Contributor
Created 02-17-2017 12:32 AM
You can try setting --driver-memory to 2g in the spark-submit command and see if that helps.
