I have an HDInsight cluster with 200 GB of RAM and 60 vcores. I am running a Spark application with the following spark-submit command:

spark-submit --master yarn --deploy-mode client --num-executors 20 --driver-memory 5G --executor-memory 3G --executor-cores 2 --py-files spark_job.zip $HOME/spark_code/Main.py

On the Spark Web UI it shows 40 cores (20 * 2 = 40), as expected. But on the YARN UI it shows 21 containers using 21 vcores, i.e. each container is using only 1 vcore when it should be using 2. Does anyone have any idea what the actual problem is? Why is each YARN container not getting more than 1 vcore?
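To make the mismatch concrete, here is a small sketch of the arithmetic implied by the flags above (the variable names are illustrative, not Spark configuration keys; the "+ 1" accounts for the YARN application-master container that client mode adds alongside the executors):

```python
# Resource arithmetic for the spark-submit flags in the question.
num_executors = 20
executor_cores = 2

# The Spark UI counts executor cores only:
spark_ui_cores = num_executors * executor_cores  # 40, matching the Spark UI

# YARN runs one extra container for the application master,
# so 20 executors + 1 AM = 21 containers, matching the YARN UI.
yarn_containers = num_executors + 1  # 21

# If YARN accounted for the requested cores per executor, the vcore
# total would be 20 * 2 executor cores + 1 AM vcore:
expected_vcores = num_executors * executor_cores + 1  # 41, not the 21 shown

print(spark_ui_cores, yarn_containers, expected_vcores)
```

So the YARN UI's 21 vcores equals the container count, which suggests YARN is charging 1 vcore per container regardless of the `--executor-cores` request.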