Support Questions

Can't specify 2G Heap

SOLVED


Explorer

I am getting garbage collection errors: "java.lang.OutOfMemoryError: GC overhead limit exceeded". Everything that I have read points to heap size.

 

I have increased all of the heap-related parameters that I can see in my YARN configuration options.
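For reference, these are the sort of heap-related settings I mean (the values here are just examples, not my actual configuration):

```properties
# Illustrative spark-defaults.conf entries (example values only)
spark.driver.memory                 2g     # heap for the driver JVM
spark.executor.memory               4g     # heap for each executor JVM
spark.yarn.driver.memoryOverhead    384    # extra off-heap headroom YARN grants (MB)
spark.yarn.executor.memoryOverhead  512
```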

 

I am trying to run spark-submit with the argument --driver-java-options "-Xmx2048m", and I get the error:

"Initial heap size set to a larger value than the maximum heap size"

 

I am not sure why it says that the maximum heap size is smaller than 2G, and I am not sure what else to look at.

 

Thanks!


1 ACCEPTED SOLUTION


Re: Can't specify 2G Heap

Master Collaborator

Don't set the heap size this way; use --driver-memory instead. The error indicates that you are actually setting the max heap smaller than the driver memory configured elsewhere, perhaps in a .conf file.
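For example, something along these lines (the master and application script are placeholders; the conf path assumes a default Spark layout):

```shell
# Check whether driver memory is already configured in a conf file
grep -i "spark.driver.memory" "$SPARK_HOME/conf/spark-defaults.conf"

# Let Spark size the driver heap itself rather than passing -Xmx directly;
# --driver-memory sets both the YARN container request and the JVM heap
spark-submit \
  --master yarn \
  --driver-memory 2g \
  my_job.py   # placeholder application
```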

2 REPLIES


Re: Can't specify 2G Heap

Explorer

I tried this, and while my job is still running, it looks like it has gotten farther than it has in the past. Thanks!