Support Questions
Find answers, ask questions, and share your expertise

Increasing the memory overhead on the cluster so yarn can run them


New Contributor

Hi,

I need to increase the YARN memory overhead for my Spark application to avoid out-of-memory exceptions, but when I run it on the cluster, YARN can't run it. Can I increase it on the Hortonworks cluster in the YARN configurations, and if so, where?


Re: Increasing the memory overhead on the cluster so yarn can run them

New Contributor
@Lucky_Luke

How are you increasing the YARN memory overhead? Are you specifying the spark.yarn.executor.memoryOverhead property in your spark-submit command?
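For reference, a minimal spark-submit invocation setting this property could look like the sketch below. The memory values and the application file name are only illustrative, not recommendations; spark.yarn.executor.memoryOverhead is specified in megabytes (in later Spark releases, 2.3 and up, the property was renamed to spark.executor.memoryOverhead).

```shell
# Illustrative only: adjust memory sizes and the application to your job.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --executor-memory 4g \
  --conf spark.yarn.executor.memoryOverhead=1024 \
  my_app.py
```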

Re: Increasing the memory overhead on the cluster so yarn can run them

New Contributor

Yes, I am doing the same.

Re: Increasing the memory overhead on the cluster so yarn can run them

New Contributor

@Lucky_Luke:
Which version of HDP (YARN) are you using?

When I used HDP 2.4, I could see where to set spark.yarn.executor.memoryOverhead.

In HDP 2.6 I can't find the place in the Ambari GUI to set it.

How did you set it?
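In case it helps: in Ambari, Spark properties that have no dedicated form field can generally be added as free-form key/value pairs in the custom properties section of the Spark service configuration (for HDP 2.6 with Spark2, that is typically "Custom spark2-defaults"; the exact menu names here are from memory, so treat them as an assumption). The entry itself is just a plain property line, e.g.:

```
# Hypothetical entry added under Spark2 -> Configs -> Custom spark2-defaults in Ambari.
# Value is in MB and only an example; restart the affected services after saving.
spark.yarn.executor.memoryOverhead=2048
```

Setting it cluster-wide this way makes it the default for all Spark jobs, whereas passing --conf on spark-submit overrides it per application.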