Support Questions
Find answers, ask questions, and share your expertise

What are Spark executors, executor instances, executor_cores, worker threads, worker nodes and number of executors?


Additionally, what exactly does dynamic allocation mean? (I know it means allocating containers/executors on the fly, but please elaborate.) What is "spark.dynamicAllocation.maxExecutors"? What should its value be?

How are these parameters related to each other? What should these parameters be set to for the best performance on the cluster?

I read the documentation, but I am afraid I am still confused by the terminology. I would really appreciate it if someone could elaborate.

1 ACCEPTED SOLUTION

Accepted Solutions

@Sree Kupp

This is a great video explaining everything in detail. It's a 6-hour training, but you can skip to and listen to the parts you are interested in.

https://www.youtube.com/watch?v=7ooZ4S7Ay6Y
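For quick reference while watching, the knobs the question asks about map onto `spark-submit` options roughly as sketched below. The values are purely illustrative, not recommendations for any particular cluster:

```shell
# Static allocation (e.g. on YARN): fix the executor count up front.
# An executor is a JVM process running on a worker node; each executor
# runs up to spark.executor.cores tasks (threads) concurrently.
spark-submit \
  --num-executors 4 \
  --executor-cores 2 \
  --executor-memory 4g \
  my_app.py

# Dynamic allocation: Spark adds and removes executors at runtime
# based on the task backlog, between minExecutors and maxExecutors.
# The external shuffle service keeps shuffle files available after
# an executor is removed.
spark-submit \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.minExecutors=1 \
  --conf spark.dynamicAllocation.maxExecutors=20 \
  --conf spark.shuffle.service.enabled=true \
  --executor-cores 2 \
  --executor-memory 4g \
  my_app.py
```

With dynamic allocation enabled, `spark.dynamicAllocation.maxExecutors` caps how many executors Spark may request; a common starting point is to derive it from the cluster's total cores divided by `spark.executor.cores`, then tune from there.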

View solution in original post

4 REPLIES



Thanks @Kshitij Badani, will look into it.

@Sree Kupp The part you are interested in starts after 1:20:00 and runs up to 2:40:00.


That was a great video @Kshitij Badani. Thanks for the pointers.