Created 10-18-2017 01:36 AM
Hi,
I need to increase the YARN memory overhead for my Spark application to avoid container memory exceptions, but when I run it on the cluster, YARN can't run it. Can I increase it on the Hortonworks cluster in the YARN configurations, and if so, where?
Created 10-18-2017 01:43 AM
How are you increasing the YARN memory overhead? Are you specifying the spark.yarn.executor.memoryOverhead property in your spark-submit command?
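For reference, a minimal spark-submit sketch that sets this property; the memory sizes, overhead value, and jar name below are placeholders, not recommendations:

    # spark.yarn.executor.memoryOverhead is specified in MB; 1024 here is illustrative
    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --executor-memory 4g \
      --conf spark.yarn.executor.memoryOverhead=1024 \
      my_app.jar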
Created 10-18-2017 02:10 AM
Yes, I am doing the same.
Created 02-05-2018 09:27 AM
@Lucky_Luke:
Which version of HDP (YARN) are you using?
When I used HDP 2.4, I saw where to set spark.yarn.executor.memoryOverhead.
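For context, the property can also be set cluster-wide in spark-defaults.conf (e.g., added as a custom property under the Spark configs in Ambari); a minimal sketch, with an illustrative value in MB:

    spark.yarn.executor.memoryOverhead  1024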
In HDP 2.6 I can't find the place in the Ambari GUI to set it.
How did you set it?