
YARN and Spark memory tuning for jobs

Contributor

Hi Team,

 

Could you suggest a guide or any recommendations for tuning the memory parameters of Spark jobs?

For example: to process 500 GB of input data with a Spark job, how many executors are needed, how much memory per executor, and so on?

 

3 Replies

Contributor

A good starting point is to review mistake number 1 in this SlideShare presentation:
https://www.slideshare.net/SparkSummit/top-5-mistakes-when-writing-spark-applications-63071421
It walks through how to choose the number of executors, cores per executor, and executor memory.
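To make that concrete, here is a minimal sketch of how that sizing exercise translates into a Spark configuration. The cluster figures (10 worker nodes with 16 cores and 64 GB of RAM each) and the resulting executor numbers are hypothetical assumptions for illustration; substitute your own node specs and redo the arithmetic.

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical cluster: 10 worker nodes, each with 16 cores and 64 GB RAM.
// Following the sizing approach in the linked talk:
//   - leave ~1 core and ~1 GB per node for the OS / Hadoop daemons  -> 15 usable cores, ~63 GB per node
//   - use ~5 cores per executor to keep HDFS I/O throughput healthy -> 3 executors per node
//   - 63 GB / 3 executors ~= 21 GB, minus ~7-10% YARN memory overhead -> ~19 GB heap per executor
//   - 10 nodes * 3 executors = 30, minus 1 slot for the driver/AM    -> 29 executors
val spark = SparkSession.builder()
  .appName("memory-tuning-example")
  .config("spark.executor.instances", "29")
  .config("spark.executor.cores", "5")
  .config("spark.executor.memory", "19g")
  .config("spark.executor.memoryOverhead", "2g") // off-heap overhead requested from YARN on top of the heap
  .getOrCreate()

// The same settings can be passed on the command line instead:
//   spark-submit --num-executors 29 --executor-cores 5 \
//                --executor-memory 19g --conf spark.executor.memoryOverhead=2g ...
```

For the 500 GB example in the question: at a 128 MB HDFS block size that is roughly 4,000 input partitions, and 29 executors x 5 cores gives 145 concurrent tasks, so the input is processed in waves. Executor memory mainly needs to accommodate the partitions in flight plus any cached or shuffled data, so treat these numbers as a starting point to refine against your actual job, not a formula.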

 

Expert Contributor

Community Manager

@hanumanth Has the reply above helped resolve your issue? If so, please mark the appropriate reply as the solution; it will make it easier for others to find the answer in the future. Thanks!


Regards,

Diana Torres,
Community Moderator

