Created 11-13-2015 06:15 AM
What are the recommended configurations and suggested tuning parameters when building a cluster for Spark workloads?
Do we have a best practices guide around this?
Created 11-16-2015 12:39 PM
Check these links:
https://spark.apache.org/docs/latest/tuning.html
http://www.slideshare.net/SparkSummit/deep-dive-into-project-tungsten-josh-rosen
http://www.slideshare.net/cfregly/advanced-apache-spark-meetup-project-tungsten-nov-12-2015
http://www.slideshare.net/SparkSummit/04-huang-duan-1
http://www.slideshare.net/SparkSummit/making-sense-of-spark-performancekay-ousterhout
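As a starting point alongside those links, here is a minimal sketch of settings that commonly come up when tuning Spark. The values are illustrative assumptions only, not recommendations — the right numbers depend entirely on your node sizes and workload:

```properties
# spark-defaults.conf -- example values only; tune per cluster and workload

# Kryo serialization is typically faster and more compact than Java serialization
spark.serializer                org.apache.spark.serializer.KryoSerializer

# Executor sizing (illustrative; size to your nodes' cores and memory,
# leaving headroom for the OS and other services)
spark.executor.cores            4
spark.executor.memory           8g

# Default parallelism for RDD shuffles; a common rule of thumb is
# 2-3 tasks per CPU core in the cluster
spark.default.parallelism       200

# Number of partitions used by Spark SQL shuffles (joins, aggregations)
spark.sql.shuffle.partitions    200
```

The same properties can be passed per job with `spark-submit --conf key=value` instead of being set cluster-wide, which is often the better place to experiment before changing defaults.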
Created 12-08-2015 05:31 PM
@Laurence Da Luz Are you asking about Spark as a whole (e.g., Spark Core) or about Spark SQL specifically? Either way, @Guilherme Braccialli's links seem to hit all the key topics.
Created 02-02-2016 01:48 AM
@Laurence Da Luz can you accept the best answer to close this thread?