Created 10-04-2016 09:35 AM
I am using a Spark Standalone cluster (version 1.6.1), and I want to set up dynamic allocation.
I have a question about this.
Do all of the dynamic allocation parameters work in standalone mode?
For example, 'spark.dynamicAllocation.maxExecutors' and 'spark.dynamicAllocation.minExecutors'.
Also, do the executor parameters work properly?
For example, 'spark.executor.cores' and 'spark.executor.memory'.
I look forward to your response.
Thanks,
Created 10-04-2016 04:14 PM
@Han Jeongphill, while Spark standalone is not officially supported by Hortonworks at this time, dynamic resource allocation is available for standalone mode at the community level. For prerequisites, configuration info, and how to start dynamic allocation on a standalone cluster, see https://spark.apache.org/docs/1.6.1/job-scheduling.html#dynamic-resource-allocation.
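As a minimal sketch, the linked guide's prerequisites for standalone mode boil down to enabling dynamic allocation plus the external shuffle service. The values below are illustrative, not recommendations, and would go in spark-defaults.conf:

```properties
# Sketch of spark-defaults.conf for dynamic allocation on a
# Spark 1.6.1 standalone cluster; the numeric values are examples only.

# Turn on dynamic allocation for applications.
spark.dynamicAllocation.enabled          true

# Required prerequisite: the external shuffle service. In standalone
# mode it runs inside each worker when this flag is set.
spark.shuffle.service.enabled            true

# Optional bounds on the executor count (illustrative values).
spark.dynamicAllocation.minExecutors     1
spark.dynamicAllocation.maxExecutors     8

# Per-executor sizing also applies in standalone mode.
spark.executor.cores                     2
spark.executor.memory                    2g
```

With these set, an application scales its executor count between the min/max bounds based on pending task load.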
Created 10-05-2016 05:35 AM
@lgeorge, The link you posted has been a great help to me.
Thanks.