Support Questions
Find answers, ask questions, and share your expertise

API for heap size


New Contributor

 

Is there a way we can alter heap size using CM API?

4 REPLIES

Re: API for heap size

Champion
For which service?

Re: API for heap size

New Contributor

Our developer team is getting the error below while running a Spark job:

Uncaught error from thread [spark-akka.actor.default-dispatcher-3] shutting down JVM since 'akka.jvm-exit-on-fatal-error' is enabled for ActorSystem[spark]
java.lang.OutOfMemoryError: Java heap space

 

I need to automate the fix by running it through the API.

Re: API for heap size

Champion
They should be able to set the executor memory size when launching the job. This will give it more heap. You can also set this globally in the Spark configs so that all Spark apps have the larger container size.

# --executor-memory sets the heap size for each executor
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --executor-memory 20G \
  /path/to/examples.jar \
  1000

The corresponding global property is spark.executor.memory.
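To apply the setting to every Spark application rather than a single job, the same property can go in spark-defaults.conf. The path and values below are illustrative, not taken from the thread:

```
# spark-defaults.conf (commonly under /etc/spark/conf; path may differ per install)
spark.executor.memory   20g
```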

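On the original question of doing this through the Cloudera Manager API: CM exposes service configuration over REST, so a config change like this can be scripted. This is only a sketch — the host, port, credentials, cluster/service names, and the exact config property name are all assumptions here; check the CM API documentation for your CM version for the real values. A minimal Python example:

```python
# Hedged sketch: push a Spark config change through the Cloudera Manager REST API.
# All names below (host, cluster, service, property) are placeholders/assumptions.
import base64
import json
import urllib.request

CM_HOST = "cm-host.example.com"   # assumption: your CM server hostname
API = "http://" + CM_HOST + ":7180/api/v13"
CLUSTER = "Cluster1"              # assumption: your cluster name in CM
SERVICE = "spark_on_yarn"         # assumption: your Spark service name in CM


def build_config_update(name, value):
    """CM API config updates are a JSON object with a list of {name, value} items."""
    return json.dumps({"items": [{"name": name, "value": value}]})


# Assumption: pushing spark.executor.memory via a client config safety valve;
# verify the actual property name for your service in the CM docs.
body = build_config_update(
    "spark-conf/spark-defaults.conf_client_config_safety_valve",
    "spark.executor.memory=20g",
)

req = urllib.request.Request(
    API + "/clusters/" + CLUSTER + "/services/" + SERVICE + "/config",
    data=body.encode(),
    method="PUT",
    headers={
        "Content-Type": "application/json",
        "Authorization": "Basic " + base64.b64encode(b"admin:admin").decode(),
    },
)
# urllib.request.urlopen(req)  # uncomment to actually send the request
```

After a config change like this, the affected service or client configs typically need to be redeployed/restarted from CM for it to take effect.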
Re: API for heap size

New Contributor
Will ask the developers to test. Thanks, appreciated.