
Increasing the memory overhead on the cluster so yarn can run them

Explorer

Hi,

I need to increase the YARN memory overhead for my Spark application to avoid out-of-memory exceptions, but when I run it on the cluster, YARN can't run it. Can I increase it on the Hortonworks cluster in the YARN configurations, and if so, where?


Re: Increasing the memory overhead on the cluster so yarn can run them

Cloudera Employee
@Lucky_Luke

How are you increasing the YARN memory overhead? Are you specifying the spark.yarn.executor.memoryOverhead property in your spark-submit command?
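For reference, a typical spark-submit invocation that sets this property on YARN might look like the following (the application file, memory sizes, and overhead value are illustrative placeholders, not recommendations):

```sh
# Request 1024 MB of off-heap overhead per executor, on top of the
# 4 GB executor heap. Note: memoryOverhead is specified in megabytes.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --executor-memory 4g \
  --conf spark.yarn.executor.memoryOverhead=1024 \
  my_app.py
```

Keep in mind that executor memory plus overhead must still fit within YARN's per-container maximum (yarn.scheduler.maximum-allocation-mb), or YARN will refuse to allocate the container.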

Re: Increasing the memory overhead on the cluster so yarn can run them

Explorer

Yes, I am doing the same.

Re: Increasing the memory overhead on the cluster so yarn can run them

Explorer

@Lucky_Luke:
Which version of HDP (YARN) are you using?

When I used HDP 2.4, I could see how to set spark.yarn.executor.memoryOverhead.

In HDP 2.6 I can't find the place to set it in the Ambari GUI.

How did you set it ?
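For what it's worth, in Ambari, Spark properties that have no dedicated field can usually be added as a custom key/value pair under Spark > Configs > "Custom spark-defaults" (or "Custom spark2-defaults" for Spark 2), which writes them into spark-defaults.conf. A sketch of the resulting entry (the value 1024 is illustrative):

```properties
# Value is in megabytes.
spark.yarn.executor.memoryOverhead=1024
```

Note that in newer Spark releases (2.3+) this property was renamed to spark.executor.memoryOverhead, so check which name your Spark version expects.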