
Spark using only 1 Vcore per executor


New Contributor

I have an HDInsight cluster with 200 GB of RAM and 60 vcores. I am running a Spark application with the following spark-submit command.

spark-submit --master yarn --deploy-mode client --num-executors 20 --driver-memory 5G --executor-memory 3G --executor-cores 2 --py-files $HOME/spark_code/

On the Spark Web UI it shows 40 cores (20 × 2 = 40) as expected, but the YARN UI shows 21 containers using 21 vcores, i.e. each container is using only 1 vcore when it should be using 2. Does anyone have any idea what the actual problem is, and why the YARN containers are not getting more than 1 vcore?
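One possible explanation worth checking (an assumption, since the scheduler configuration is not shown here): YARN's CapacityScheduler uses DefaultResourceCalculator by default, which schedules and reports containers by memory only, so the YARN UI shows 1 vcore per container regardless of --executor-cores, even though Spark itself runs 2 tasks per executor. Switching to DominantResourceCalculator in capacity-scheduler.xml makes YARN account for (and display) CPU as well:

```
<!-- capacity-scheduler.xml: make the CapacityScheduler consider
     both memory and vcores instead of memory alone (assumes the
     cluster is using the CapacityScheduler, the HDInsight default) -->
<property>
  <name>yarn.scheduler.capacity.resource-calculator</name>
  <value>org.apache.hadoop.yarn.util.resource.DominantResourceCalculator</value>
</property>
```

After changing this, a restart of the ResourceManager is required for the setting to take effect.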
