Dynamic allocation on Spark Standalone cluster
Labels: Apache Spark
Created 10-04-2016 09:35 AM
I'm using a Spark Standalone cluster (version 1.6.1) and I want to set up dynamic allocation. I have a question about this.
Do all of the dynamic allocation parameters, such as 'spark.dynamicAllocation.maxExecutors' and 'spark.dynamicAllocation.minExecutors', work in standalone mode?
And do the executor parameters, such as 'spark.executor.cores' and 'spark.executor.memory', work properly as well?
I look forward to your response.
Thanks,
Created 10-04-2016 04:14 PM
@Han Jeongphill, while Spark standalone is not officially supported by Hortonworks at this time, dynamic resource allocation is available for standalone mode at the community level. For prerequisites, configuration info, and how to start dynamic allocation on a standalone cluster, see https://spark.apache.org/docs/1.6.1/job-scheduling.html#dynamic-resource-allocation.
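For reference, here is a minimal sketch of what the linked docs describe, expressed as a Scala SparkConf for a 1.6.1 standalone cluster. The property names come from the official documentation; the master URL, app name, and resource values are placeholders I chose for illustration, not recommendations.

```scala
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setMaster("spark://master-host:7077")          // hypothetical master URL
  .setAppName("dynamic-allocation-demo")
  // Core dynamic allocation switches; these apply in standalone mode too.
  .set("spark.dynamicAllocation.enabled", "true")
  .set("spark.dynamicAllocation.minExecutors", "1")
  .set("spark.dynamicAllocation.maxExecutors", "10")
  // Dynamic allocation requires the external shuffle service. In standalone
  // mode, each worker must also be started with this property set to true
  // (e.g. via spark-defaults.conf on the worker nodes).
  .set("spark.shuffle.service.enabled", "true")
  // Per-executor sizing still applies as usual.
  .set("spark.executor.cores", "2")
  .set("spark.executor.memory", "2g")

val sc = new SparkContext(conf)
```

The same properties can be set in spark-defaults.conf or passed with --conf on spark-submit; the key standalone-specific step is enabling the shuffle service on the workers before the application starts.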
Created 10-05-2016 05:35 AM
@lgeorge, the link you posted has been a great help to me.
Thanks.
