
spark-shell java.lang.OutOfMemoryError: PermGen space

Explorer

I'm using the spark-shell to run a linear regression on a small data set with 5000 observations, but I get a "java.lang.OutOfMemoryError: PermGen space" error message. How can I increase the MaxPermSize for the spark-shell?
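For context, the session is roughly along these lines (a simplified sketch; the file name, feature layout, and the choice of MLlib's LinearRegressionWithSGD are illustrative rather than my exact code):

import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.{LabeledPoint, LinearRegressionWithSGD}

// sc is the SparkContext that spark-shell creates automatically
val data = sc.textFile("observations.csv").map { line =>
  val cols = line.split(',').map(_.toDouble)
  LabeledPoint(cols(0), Vectors.dense(cols.drop(1)))   // label in the first column, features after it
}.cache()

val model = LinearRegressionWithSGD.train(data, 100)   // 100 iterations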

 

Thanks.

Stefan

1 ACCEPTED SOLUTION

Master Collaborator

(By the way, there is a separate forum for Spark: http://community.cloudera.com/t5/Advanced-Analytics-Apache-Spark/bd-p/Spark )

 

If the error is definitely in the shell / REPL, then I believe you just want to set SPARK_REPL_OPTS:

 

SPARK_REPL_OPTS="-XX:MaxPermSize=256m" spark-shell

 

I find that this additional flag can help with PermGen usage in Scala, which is an issue when running tests too:

 

SPARK_REPL_OPTS="-XX:+CMSClassUnloadingEnabled -XX:MaxPermSize=256m" spark-shell

 

 




Explorer

 

Great, that solved my problem. I'll use the dedicated Spark forum for Spark-related questions from now on; thanks for pointing that out.

 

Thanks for your help.

Stefan