04-09-2023 09:49 PM
Hi @mrTao It is not a good idea to use the entire YARN cluster memory. You can tune the memory from the Spark side by adjusting the following parameters:

--conf spark.executor.instances=5
--conf spark.driver.memory=10g
--conf spark.driver.memoryOverhead=1g
--conf spark.executor.memory=10g
--conf spark.executor.memoryOverhead=1g

With the above configuration, YARN will allocate 66g: each executor needs 11g (10g executor memory + 1g overhead), so 5 executors × 11g = 55g, plus 11g for the driver (10g + 1g overhead) = 66g in total. Check your spark-submit command once again and tune the above parameters according to your requirements.
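For reference, here is a minimal spark-submit sketch putting those flags together in one command. The application jar and main class (my_app.jar, com.example.MyApp) are placeholders, not from the original post; substitute your own:

spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.executor.instances=5 \
  --conf spark.driver.memory=10g \
  --conf spark.driver.memoryOverhead=1g \
  --conf spark.executor.memory=10g \
  --conf spark.executor.memoryOverhead=1g \
  --class com.example.MyApp \
  my_app.jar

Note that YARN rounds each container request up to a multiple of yarn.scheduler.minimum-allocation-mb, so the actual allocation can be slightly higher than the 66g calculated above.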