Please suggest any guides or recommendations for tuning the memory parameters of Spark jobs.
For example: to process 500 GB of input data with a Spark job, how many executors are needed, and how much memory per executor?
A good starting point is to review mistake number 1 in this SlideShare deck: https://www.slideshare.net/SparkSummit/top-5-mistakes-when-writing-spark-applications-63071421 It gives a good starting point for tuning the number of cores, executor memory, etc.
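To make the heuristic from that deck concrete, here is a rough sketch of the common sizing arithmetic. The cluster specs (10 nodes, 16 cores, 64 GB RAM each) are assumptions for illustration only; plug in your own hardware. The "5 cores per executor" rule of thumb and the memory-overhead reservation come from the same talk.

```python
# Assumed example cluster: 10 nodes, 16 cores and 64 GB RAM per node.
nodes, cores_per_node, mem_per_node_gb = 10, 16, 64

# Leave 1 core and ~1 GB per node for the OS and Hadoop daemons.
usable_cores = cores_per_node - 1       # 15
usable_mem_gb = mem_per_node_gb - 1     # 63

# Rule of thumb: ~5 cores per executor keeps HDFS I/O throughput healthy.
executor_cores = 5
executors_per_node = usable_cores // executor_cores   # 3

# Reserve one executor slot cluster-wide for the driver (YARN AM).
num_executors = nodes * executors_per_node - 1        # 29

# Split node memory across executors, then leave ~10% of each share
# for off-heap overhead (spark.executor.memoryOverhead).
raw_mem_per_executor = usable_mem_gb / executors_per_node   # 21.0 GB
executor_memory_gb = int(raw_mem_per_executor * 0.90)       # 18 GB

print(f"--num-executors {num_executors} "
      f"--executor-cores {executor_cores} "
      f"--executor-memory {executor_memory_gb}G")
```

You would then pass the printed values to spark-submit. Note that the input size (e.g. 500 GB) mainly drives partition count and shuffle tuning, not this per-executor arithmetic, which is bounded by the cluster hardware.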
You can also review this guide on Spark tuning.
@hanumanth Has the reply helped resolve your issue? If so, please mark the appropriate reply as the solution; that will make it easier for others to find the answer in the future. Thanks!