spark-shell java.lang.OutOfMemoryError: PermGen space
Labels: Apache Spark
Created on 04-14-2014 03:44 PM - edited 09-16-2022 01:57 AM
I'm using the spark-shell to run a linear regression on a small data set with 5000 observations, but I get a "java.lang.OutOfMemoryError: PermGen space" error message. How can I increase the MaxPermSize for the spark-shell?
Thanks.
Stefan
Created 04-14-2014 11:47 PM
(By the way, there is a separate forum for Spark: http://community.cloudera.com/t5/Advanced-Analytics-Apache-Spark/bd-p/Spark )
If the error is definitely in the shell / REPL, then I believe you just want to set SPARK_REPL_OPTS:
SPARK_REPL_OPTS="-XX:MaxPermSize=256m" spark-shell
I find this additional setting can also help with PermGen usage in Scala, which is an issue when running tests too:
SPARK_REPL_OPTS="-XX:+CMSClassUnloadingEnabled -XX:MaxPermSize=256m" spark-shell
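A quick way to check that the flag actually reached the REPL's JVM is to inspect the runtime arguments from inside spark-shell. This is a minimal sketch, assuming a standard JDK and a Scala 2.10/2.11-era shell; the calls below are stock java.lang.management APIs, not anything Spark-specific:

// Paste into the spark-shell prompt after launching it with the option above.
import java.lang.management.ManagementFactory
import scala.collection.JavaConverters._

// List the JVM arguments the shell was started with and keep the PermGen-related ones.
val jvmArgs = ManagementFactory.getRuntimeMXBean.getInputArguments.asScala
jvmArgs.filter(_.contains("MaxPermSize")).foreach(println)
// Should print -XX:MaxPermSize=256m (or whatever value you passed) if the setting took effect.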
Created 04-15-2014 09:26 AM
Great, that solved my problem. I'll use the dedicated Spark forum for Spark-related questions from now on; thanks for pointing that out.
Thanks for your help.
Stefan
