Getting "Warning: Maximum heap size rounded up to 512 MB" while running spark-shell
Labels: Apache Hadoop, Apache Hive, Apache Spark
Created 01-24-2023 09:56 AM
Even after passing the executor memory and driver memory as 1 GB, the maximum heap size is still being rounded up to 512 MB:

spark-shell --conf spark.executor.memory=1g --conf spark.driver.memory=1g
Warning: Maximum heap size rounded up to 512 MB

Is anything missing?
Created 10-24-2023 09:43 PM
I think you need to verify that the YARN and Spark resources are configured properly. If they are, then check the Spark UI; it will show the driver memory and executor memory. If those values come up as expected, you can safely ignore the warning.
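To confirm what actually took effect, you can also query the configuration from inside the running shell. A minimal sketch in Scala (spark-shell's REPL), assuming the default sc SparkContext is available:

// Settings Spark believes it was given
sc.getConf.getOption("spark.driver.memory")    // expect Some(1g)
sc.getConf.getOption("spark.executor.memory")  // expect Some(1g)

// Max heap the driver JVM actually received, in MB
Runtime.getRuntime.maxMemory / (1024 * 1024)

The same values appear on the Environment tab of the Spark UI. If the driver reports roughly 1 GB of max heap here, the settings took effect and, as noted above, the warning can be safely ignored.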
