Support Questions
Find answers, ask questions, and share your expertise

Spark jobs failing randomly

Hi All,

I am getting this alert:

java.lang.OutOfMemoryError: Java heap space

19/03/31 02:24:13 WARN AbstractChannelHandlerContext: An exception 'java.lang.OutOfMemoryError: Java heap space' [enable DEBUG level for full stacktrace] was thrown by a user handler's exceptionCaught() method while handling the following exception

WARN TransportChannelHandler: Exception in connection from hostname/ipaddress:7077

java.lang.OutOfMemoryError: Java heap space

After restarting the service it works fine for a while, but then it fails again. I need to fix this permanently. Please help me with this. Thanks in advance.


Re: Spark jobs failing randomly

@ram sriram the error says "java.lang.OutOfMemoryError: Java heap space"

Evaluate your driver and executor memory settings and increase them accordingly.
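As a sketch, assuming the job is launched with spark-submit, the driver and executor memory can be raised with the standard flags below. The memory values, core count, and application jar path are illustrative placeholders, not taken from this thread; tune them to your cluster's actual capacity:

```shell
# Illustrative spark-submit invocation with increased memory.
# All values and the jar path are placeholders -- adjust for your cluster.
spark-submit \
  --driver-memory 4g \
  --executor-memory 8g \
  --executor-cores 4 \
  path/to/your-app.jar
```

The same settings can be made permanent for all jobs by adding spark.driver.memory and spark.executor.memory entries to spark-defaults.conf, which avoids having to pass the flags on every submission.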