Spark dynamic allocation lets Spark add and remove executors at runtime based on your application's demand. However, every executor it launches still uses the same fixed specification you configured (e.g. cores and RAM per executor).
In other words, Dynamic Allocation [1] controls the number of executor containers running in parallel on YARN, not the number of CPU vcores allocated to any single executor container.
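A minimal sketch of how this split shows up in configuration (the application name and resource values are illustrative assumptions, not taken from the original): the `spark.dynamicAllocation.*` settings bound how many executors YARN may run, while `spark.executor.cores` and `spark.executor.memory` fix the spec that every one of those executors shares.

```shell
# Illustrative spark-submit invocation (values are examples only).
# Dynamic allocation controls the executor COUNT (1..20 here);
# each executor still gets exactly 4 vcores and 8g of memory.
spark-submit \
  --master yarn \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.minExecutors=1 \
  --conf spark.dynamicAllocation.maxExecutors=20 \
  --conf spark.shuffle.service.enabled=true \
  --conf spark.executor.cores=4 \
  --conf spark.executor.memory=8g \
  my_app.py
```

Note that dynamic allocation also requires a way to preserve shuffle data when executors are removed; enabling the external shuffle service (as above) is the classic approach on YARN.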