Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.
Announcements
This board is archived and read-only for historical reference. To ask a new question, please post a new topic on the appropriate active board.

Can't specify 2G Heap

Explorer

I am getting garbage collection errors: "java.lang.OutOfMemoryError: GC overhead limit exceeded". Everything I have read points to heap size.


I have upped all the heap-related parameters that I can see in my YARN configuration options.


I am trying to run spark-submit with the argument --driver-java-options " -Xmx2048m", and I get the error:

"Initial heap size set to a larger value than the maximum heap size"


I am not sure why the maximum heap size would be smaller than 2G, or what else to look at.


Thanks!


1 ACCEPTED SOLUTION

Master Collaborator

Don't set the heap size this way; use --driver-memory instead. The error indicates that you are setting the maximum heap smaller than the driver memory configured elsewhere, perhaps in a .conf file.
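Following the suggestion above, the submit command would look something like this (the master, class name, and jar are placeholders, not from the original thread):

```shell
# Set the driver heap through Spark's own flag rather than raw JVM options,
# so Spark keeps the initial and maximum heap settings consistent.
spark-submit \
  --master yarn \
  --driver-memory 2g \
  --class com.example.MyApp \
  myapp.jar
```

Equivalently, spark.driver.memory can be set in spark-defaults.conf; mixing that with a hand-set -Xmx in --driver-java-options is what produces the conflicting heap sizes.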


REPLIES

Explorer

I tried this, and while my job is still running, it looks like it has gotten farther than it has in the past. Thanks!