New Contributor
Posts: 7
Registered: ‎08-02-2015

Re: Spark Streaming Job submit through Envelope end with OutOfMemory Exception

After reviewing the Envelope source code, I found a doesCache() method that controls whether a step's data gets cached, and it is set to true by default.
I changed this default to false and recompiled the project.
Running the Spark job with the new jar, there are no longer any entries in the Spark UI's Storage tab.
The job has now been running for about 24 hours with no OOM errors so far; I will keep monitoring it.
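For anyone wanting to make the same change, the edit amounts to flipping the default return value of doesCache(). The class and field names below are hypothetical (only doesCache() itself comes from the Envelope source discussed here); this is a minimal sketch, assuming the method gates a DataFrame cache() call:

```java
// Illustrative sketch only: class and field names are made up for this
// example; doesCache() is the method name from the Envelope source.
public class CachingStep {

    // Assumed flag behind doesCache(); flipping this default to false
    // is the change described above (it was true in the stock build).
    private boolean cache = false;

    public boolean doesCache() {
        return cache;
    }

    public void run() {
        if (doesCache()) {
            // df.cache(); // would persist the step's DataFrame in executor
            //             // memory and appear under the Spark UI Storage tab
        }
    }
}
```

If Envelope's per-step configuration allows it, exposing this as a config option rather than recompiling would be a cleaner long-term fix, since cached DataFrames that are never unpersisted accumulate in executor memory across batches.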
Thank you, everyone, for the helpful suggestions.