How many vCores are allocated for Tasks within the Executors?

Contributor

I understand that No. of Tasks ~= No. of Blocks, so how many vCores are allocated to the Tasks within the Executors?

1 REPLY

Mentor
> How many vCores are allocated for Tasks within the Executors?

Tasks run inside Executors whose resources are allocated up front; the tasks
themselves do not cause any further vCore allocation.

The excerpt below explains the relationship between tasks and executors from
a resource and concurrency viewpoint:

"""
Every Spark executor in an application has the same fixed number of cores
and same fixed heap size. The number of cores can be specified with the
--executor-cores flag when invoking spark-submit, spark-shell, and pyspark
from the command line, or by setting the spark.executor.cores property in
the spark-defaults.conf file or on a SparkConf object. Similarly, the heap
size can be controlled with the --executor-memory flag or the
spark.executor.memory property. The cores property controls the number of
concurrent tasks an executor can run. --executor-cores 5 means that each
executor can run a maximum of five tasks at the same time.
"""
Read more at
http://blog.cloudera.com/blog/2015/03/how-to-tune-your-apache-spark-jobs-part-2/
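
If it helps to see those settings in context, here is a minimal sketch
(assuming Spark on YARN; the application name, executor count, memory size,
and HDFS paths are just illustrative placeholders) of setting the same
properties on a SparkConf and how they bound task concurrency:

import org.apache.spark.{SparkConf, SparkContext}

// Illustrative values only: 4 executors x 5 cores each => at most 20 tasks
// run concurrently; remaining tasks queue until a core (task slot) frees up.
val conf = new SparkConf()
  .setAppName("executor-cores-sketch")    // placeholder app name
  .set("spark.executor.instances", "4")   // number of executors (YARN)
  .set("spark.executor.cores", "5")       // concurrent task slots per executor
  .set("spark.executor.memory", "4g")     // heap size per executor
val sc = new SparkContext(conf)

// Each HDFS block of the input typically becomes one partition, and each
// partition is processed by one task, hence No. of Tasks ~= No. of Blocks.
val counts = sc.textFile("hdfs:///tmp/input")   // placeholder path
  .flatMap(_.split("\\s+"))
  .map(word => (word, 1))
  .reduceByKey(_ + _)
counts.saveAsTextFile("hdfs:///tmp/output")     // placeholder path
sc.stop()

The command-line equivalent is the --executor-cores 5 and --executor-memory 4g
flags to spark-submit, as in the quoted text above. There is no per-task vCore
allocation to tune separately: by default (spark.task.cpus=1) each task simply
occupies one of the executor's cores for as long as it runs.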