Created 10-12-2018 04:16 AM
Hello All,
I observed in our cluster environment that one job sometimes takes 100 GB of memory and at other times only 2 GB (even though memory is available).
Please suggest.
Thanks,
Priya
Created 10-14-2018 06:42 PM
Is it MapReduce or Spark?
MapReduce memory usage can vary, since it depends on the number of mappers and reducers, and mainly on the mappers.
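To illustrate the point, here is a rough back-of-the-envelope sketch (not your actual job): for a plain MapReduce job on YARN, the total memory a job grabs is roughly the number of concurrently running containers times the per-container size (mapreduce.map.memory.mb / mapreduce.reduce.memory.mb). Because the number of mappers follows the number of input splits, a run over a big input can request far more memory than a run over a small input. The numbers below are made up for illustration only.

# Hypothetical illustration: why the same MapReduce job can peak at very
# different memory totals on different runs.
map_memory_mb = 4096      # assumed mapreduce.map.memory.mb (per map container)
reduce_memory_mb = 2048   # assumed mapreduce.reduce.memory.mb (per reduce container)
num_reducers = 2          # assumed mapreduce.job.reduces

def peak_memory_gb(num_mappers, num_reducers):
    # Upper bound if all map and reduce containers happen to run at the same time.
    total_mb = num_mappers * map_memory_mb + num_reducers * reduce_memory_mb
    return total_mb / 1024

# Number of mappers is driven by the number of input splits for that run.
print(peak_memory_gb(24, num_reducers))  # large input, 24 splits -> ~100 GB
print(peak_memory_gb(1, num_reducers))   # small input, 1 split   -> ~8 GB

So if the input size (and therefore the split count) changes between runs, the job's memory footprint will change with it even though the job itself is the same.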
Created 10-15-2018 10:57 PM