Created on 03-31-2017 09:45 PM - edited 09-16-2022 04:23 AM
I am getting garbage collection errors: "java.lang.OutOfMemoryError: GC overhead limit exceeded". Everything I have read points to heap size, so I have increased all of the heap-related parameters I can find in my YARN configuration.
I then tried running spark-submit with the argument --driver-java-options "-Xmx2048m", but I get the error "Initial heap size set to a larger value than the maximum heap size".
I don't understand why the maximum heap size would be smaller than 2 GB, and I'm not sure what else to look at.
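For reference, the command I am running looks roughly like this (the class name and jar are placeholders for my actual application):

spark-submit \
  --master yarn \
  --driver-java-options "-Xmx2048m" \
  --class com.example.MyApp \
  myapp.jar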
Thanks!
Created 04-01-2017 03:16 AM
Don't set the heap size this way; use --driver-memory instead. The error indicates that you are actually setting the maximum heap smaller than the driver memory configured elsewhere, perhaps in a .conf file, so the JVM starts with an initial heap larger than your 2 GB cap.
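For example, a minimal sketch of the submit command (assuming a YARN deployment; the class and jar names are placeholders):

spark-submit \
  --master yarn \
  --driver-memory 4g \
  --class com.example.MyApp \
  myapp.jar

This lets Spark set the driver JVM's maximum heap for you, so there is no need to pass -Xmx through --driver-java-options. The same value can also be set as spark.driver.memory in spark-defaults.conf.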
Created 04-03-2017 02:23 PM
I tried this, and while the job is still running, it looks like it has gotten farther than it ever has before. Thanks!