Support Questions

Find answers, ask questions, and share your expertise

YARN and Spark memory tuning for jobs

Explorer

Hi Team,

 

Please suggest a guide or recommendations for tuning the memory parameters for Spark jobs.

E.g., to process 500 GB of input data with a Spark job, how many executors are needed, and how much memory is required for each executor?

 

3 REPLIES

Cloudera Employee

A good starting point is to review mistake number 1 in this SlideShare:
https://www.slideshare.net/SparkSummit/top-5-mistakes-when-writing-spark-applications-63071421
It covers how to size the number of cores, the executor memory, and the number of executors.
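The sizing heuristic from that deck can be sketched as a quick calculation: use roughly 5 cores per executor, reserve one core and 1 GB per node for the OS and Hadoop daemons, leave one executor slot for the YARN application master / driver, and set aside about 10% of each executor's memory share for YARN overhead. The 10-node, 16-core, 64 GB cluster below is a hypothetical example, not the poster's actual hardware:

```python
def size_executors(nodes, cores_per_node, mem_per_node_gb,
                   cores_per_executor=5, overhead_fraction=0.10):
    """Rough executor sizing per the common 5-cores-per-executor heuristic."""
    # Reserve 1 core and 1 GB on every node for the OS and Hadoop daemons.
    usable_cores = cores_per_node - 1
    usable_mem_gb = mem_per_node_gb - 1
    executors_per_node = usable_cores // cores_per_executor
    # Leave one executor slot for the YARN application master / driver.
    total_executors = nodes * executors_per_node - 1
    # Each executor's raw share of node memory, minus ~10% for
    # spark.executor.memoryOverhead that YARN adds on top of the heap.
    raw_share_gb = usable_mem_gb / executors_per_node
    heap_gb = int(raw_share_gb * (1 - overhead_fraction))
    return total_executors, heap_gb

# Hypothetical cluster: 10 nodes, 16 cores and 64 GB RAM each.
executors, heap_gb = size_executors(10, 16, 64)
print(executors, heap_gb)  # -> 29 executors, 18 GB heap each
```

With those numbers you would submit with something like `--num-executors 29 --executor-cores 5 --executor-memory 18G`. For a 500 GB input, also check that the shuffle partition count keeps individual tasks well under the per-task memory share.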

 

Rising Star

Community Manager

@hanumanth Has the reply helped resolve your issue? If so, please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future. Thanks.


Regards,

Diana Torres,
Community Moderator

