
Dynamic allocation on Spark Standalone cluster

Explorer

I am using a Spark Standalone cluster (version 1.6.1), and I would like to set up dynamic allocation. I have a couple of questions:

Do all of the dynamic allocation parameters, such as 'spark.dynamicAllocation.maxExecutors' and 'spark.dynamicAllocation.minExecutors', work in standalone mode?

And do the executor parameters, such as 'spark.executor.cores' and 'spark.executor.memory', also work properly?
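
For example, the kind of configuration I have in mind looks like this (just a rough sketch; the master URL and all values are placeholders):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Placeholder settings to illustrate the parameters I am asking about.
val conf = new SparkConf()
  .setMaster("spark://master-host:7077")             // hypothetical master URL
  .setAppName("DynamicAllocationTest")
  .set("spark.dynamicAllocation.enabled", "true")
  .set("spark.dynamicAllocation.minExecutors", "1")
  .set("spark.dynamicAllocation.maxExecutors", "10")
  .set("spark.executor.cores", "2")                  // cores per executor
  .set("spark.executor.memory", "2g")                // memory per executor

val sc = new SparkContext(conf)
```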

I look forward to your response.

Thanks,

1 ACCEPTED SOLUTION

Super Collaborator

@Han Jeongphill, while Spark standalone is not officially supported by Hortonworks at this time, dynamic resource allocation is available for standalone mode at the community level. For prerequisites, configuration info, and how to start dynamic allocation on a standalone cluster, see https://spark.apache.org/docs/1.6.1/job-scheduling.html#dynamic-resource-allocation.
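
Per the 1.6.1 docs linked above, the gist in standalone mode is to enable dynamic allocation plus the external shuffle service it depends on. A minimal sketch of the application-side settings (the host name and executor limits are placeholders, not recommendations):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Application-side configuration for dynamic allocation on a standalone cluster.
val conf = new SparkConf()
  .setMaster("spark://master-host:7077")              // hypothetical master URL
  .setAppName("DynamicAllocationOnStandalone")
  .set("spark.dynamicAllocation.enabled", "true")     // turn dynamic allocation on
  .set("spark.shuffle.service.enabled", "true")       // dynamic allocation requires it
  .set("spark.dynamicAllocation.minExecutors", "1")   // lower bound on executor count
  .set("spark.dynamicAllocation.maxExecutors", "10")  // upper bound on executor count

val sc = new SparkContext(conf)
```

Note that the external shuffle service must also be running on the cluster side: in standalone mode, the docs say to start each worker with spark.shuffle.service.enabled set to true (for example via conf/spark-defaults.conf on the worker nodes).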


2 REPLIES


Explorer

@lgeorge, The link you posted has been a great help to me.

Thanks.