
Hive on Spark with Spark Dynamic Resource Allocation


Explorer

In the Hive configuration we have:

spark.dynamicAllocation.enabled = true (verified)

spark.executor.cores = 6

Does dynamic allocation override the fixed setting of 6 cores? It does not appear that cores are being allocated dynamically.

 

[Attachment: spark_cores.png]

2 Replies

Re: Hive on Spark with Spark Dynamic Resource Allocation

Contributor

Hi

 

Spark dynamic allocation lets Spark add and remove executors at runtime based on your application's demand. For each executor, however, it still respects your per-executor settings (e.g. cores and memory per executor) and launches every executor with the same fixed spec.
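As a minimal sketch (these are standard Spark properties; the values are illustrative, not recommendations), a Hive on Spark session might set:

set spark.dynamicAllocation.enabled=true;
set spark.shuffle.service.enabled=true;
set spark.executor.cores=6;
set spark.executor.memory=8g;

spark.shuffle.service.enabled is required for dynamic allocation on YARN. With this configuration the number of executors grows and shrinks with the workload, but every executor that is launched still uses the fixed 6-core / 8 GB spec.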

 

Hope it helps.

Re: Hive on Spark with Spark Dynamic Resource Allocation

Master Guru
Dynamic allocation [1] controls the number of executor containers running in parallel on YARN, not the number of CPU vcores allocated to a single executor container.

[1] https://spark.apache.org/docs/latest/job-scheduling.html#dynamic-resource-allocation
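If the goal is to bound how many executors run in parallel, the relevant knobs are the spark.dynamicAllocation.* properties; a hedged sketch with illustrative values (not recommendations):

set spark.dynamicAllocation.minExecutors=1;
set spark.dynamicAllocation.initialExecutors=2;
set spark.dynamicAllocation.maxExecutors=20;

spark.executor.cores only sizes each individual container; the total vcores used by a query is that per-executor value multiplied by however many executors dynamic allocation decides to run at the moment.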