In our Hive configuration we have spark.dynamicAllocation.enabled = true (verified), and
spark.executor.cores = 6
Does dynamic allocation override this fixed setting of 6 cores? It does not appear that cores are being allocated dynamically.
Spark dynamic allocation lets Spark add and remove executors at runtime based on your application's demand. It scales the *number* of executors, not their size: each executor still uses the fixed spec you configured (e.g. cores and memory per executor). So spark.executor.cores = 6 is not overridden; you should instead see the executor count change between spark.dynamicAllocation.minExecutors and spark.dynamicAllocation.maxExecutors.
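For illustration, a minimal configuration sketch (property names are real Spark settings; the specific min/max values are just example choices, and before Spark 3.0 dynamic allocation also requires the external shuffle service):

```shell
# Executor *size* stays fixed: every executor gets 6 cores and 8 GiB.
# Dynamic allocation varies only the executor *count*.
spark-submit \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.minExecutors=1 \
  --conf spark.dynamicAllocation.maxExecutors=20 \
  --conf spark.shuffle.service.enabled=true \
  --conf spark.executor.cores=6 \
  --conf spark.executor.memory=8g \
  your_app.jar
```

With this setup, an idle application can shrink toward 1 executor and a busy stage can grow toward 20, but every executor it acquires will have exactly 6 cores.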
Hope it helps.