How many vCores are allocated for tasks within the executors?
Labels: Apache Spark
Contributor
Created on 07-22-2018 04:50 PM - edited 09-16-2022 06:30 AM
No. of Tasks ~= No. of Blocks
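That rule of thumb follows from how Spark splits input: each HDFS block of an input file typically becomes one partition, and each partition is processed by one task. A minimal sketch of how to check it, assuming a running SparkContext named sc and a hypothetical HDFS path used only for illustration:

# Each HDFS block of the input generally maps to one partition,
# and each partition is processed by one task.
rdd = sc.textFile("hdfs:///data/input.txt")
print(rdd.getNumPartitions())  # roughly the number of HDFS blocks backing the file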
1 REPLY
Mentor
Created 07-22-2018 05:02 PM
> How many vCores are allocated for tasks within the executors?
Tasks run inside pre-allocated executors and do not trigger any further
resource allocation. Read on below to understand the relationship between
tasks and executors from a resource and concurrency viewpoint:
"""
Every Spark executor in an application has the same fixed number of cores
and same fixed heap size. The number of cores can be specified with the
--executor-cores flag when invoking spark-submit, spark-shell, and pyspark
from the command line, or by setting the spark.executor.cores property in
the spark-defaults.conf file or on a SparkConf object. Similarly, the heap
size can be controlled with the --executor-memory flag or the
spark.executor.memory property. The cores property controls the number of
concurrent tasks an executor can run. --executor-cores 5 means that each
executor can run a maximum of five tasks at the same time.
"""
Read more at
http://blog.cloudera.com/blog/2015/03/how-to-tune-your-apache-spark-jobs-part-2/
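To make that concrete, here is a minimal PySpark sketch of setting those two properties on a SparkConf object; the values (5 cores, 4g of heap) are illustrative only, not recommendations:

from pyspark import SparkConf
from pyspark.sql import SparkSession

# Illustrative values only; size these to your cluster.
conf = (SparkConf()
        .set("spark.executor.cores", "5")     # each executor can run up to 5 tasks concurrently
        .set("spark.executor.memory", "4g"))  # fixed heap size per executor

spark = SparkSession.builder.config(conf=conf).getOrCreate()

The equivalent command-line form would be spark-submit --executor-cores 5 --executor-memory 4g, as described in the quote above.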
