03-31-2017 09:45 PM
I am getting garbage collection errors: "java.lang.OutOfMemoryError: GC overhead limit exceeded". Everything I have read points to heap size.
I have increased all the heap-related parameters that I can find in my YARN configuration options.
When I run spark-submit with the argument --driver-java-options "-Xmx2048m", I get the error
"Initial heap size set to a larger value than the maximum heap size"
I am not sure why it says the maximum heap size is smaller than 2 GB, and I am not sure what else to look at.
04-01-2017 03:16 AM
Don't set the heap size this way; use --driver-memory instead. The error indicates that you are setting the max heap (-Xmx) smaller than the driver memory already configured elsewhere, perhaps in a .conf file such as spark-defaults.conf.
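For reference, a sketch of the suggested invocation (the class name, jar, and 4g value are placeholders; pick a driver memory that fits your cluster):

```shell
# Avoid: overriding -Xmx via --driver-java-options conflicts with
# Spark's own heap settings (e.g. spark.driver.memory in
# conf/spark-defaults.conf), producing the "Initial heap size" error.
#   spark-submit --driver-java-options "-Xmx2048m" ...

# Preferred: let Spark size the driver JVM heap itself.
spark-submit \
  --driver-memory 4g \
  --class com.example.MyApp \
  myapp.jar
```

It is also worth checking conf/spark-defaults.conf for an existing spark.driver.memory entry, since a value set there applies to every submission unless overridden on the command line.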