Support Questions

Can't specify 2G Heap

Explorer

I am getting garbage collection errors: "java.lang.OutOfMemoryError: GC overhead limit exceeded". Everything I have read points to heap size.

I have increased all of the heap-related parameters I can see in my YARN configuration options.

I am trying to run spark-submit with the argument --driver-java-options "-Xmx2048m", and I get the error:

"Initial heap size set to a larger value than the maximum heap size"

I am not sure why it thinks the maximum heap size is smaller than 2G, or what else to look at.
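For what it's worth, that JVM message simply means the initial heap (-Xms) ended up larger than the maximum heap (-Xmx). You can reproduce it outside of Spark, assuming a local JVM is on the PATH:

```shell
# -Xms (initial heap) set above -Xmx (max heap) triggers the same error
# the driver is hitting; the JVM refuses to start.
java -Xms2048m -Xmx1024m -version
```

Spark builds the driver JVM's heap flags from its own memory settings, so a stray -Xmx passed separately can end up below the -Xms Spark supplies.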


Thanks!


1 ACCEPTED SOLUTION

Accepted Solutions

Master Collaborator

Don't set the heap size this way; use --driver-memory instead. The message indicates you are actually setting the max heap smaller than the driver memory configured elsewhere, perhaps in a .conf file.
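As a sketch, the invocation would look something like this (the class name and jar are placeholders, not taken from the original post):

```shell
# Let Spark size the driver JVM heap via --driver-memory, rather than
# passing -Xmx through --driver-java-options.
# com.example.MyApp and my-app.jar are placeholder names.
spark-submit \
  --master yarn \
  --driver-memory 2g \
  --class com.example.MyApp \
  my-app.jar
```

This keeps Spark's own accounting of driver memory (spark.driver.memory) consistent with the JVM flags it generates.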


2 REPLIES


Explorer

I tried this, and while my job is still running, it looks like it has gotten farther than it has in the past. Thanks!