
Getting "Warning: Maximum heap size rounded up to 512 MB" while running spark-shell


Even after passing both the executor memory and the driver memory as 1 GB, I am still getting the warning that the maximum heap size has been rounded up to 512 MB:

spark-shell --conf spark.executor.memory=1g --conf spark.driver.memory=1g
Warning: Maximum heap size rounded up to 512 MB
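For reference, the same 1 GB settings can also be expressed with spark-shell's dedicated memory options instead of --conf (a sketch using the standard --driver-memory and --executor-memory flags, shown only as an equivalent invocation, not as a confirmed fix for the warning):

spark-shell --driver-memory 1g --executor-memory 1g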

Is there anything that I am missing?
