Support Questions
Find answers, ask questions, and share your expertise

Spark using only 1 Vcore per executor


I have an HDInsight cluster with 200 GB of RAM and 60 vCores. I am running a Spark application with the following spark-submit command.

spark-submit --master yarn --deploy-mode client --num-executors 20 --driver-memory 5G --executor-memory 3G --executor-cores 2 --py-files spark_job.zip $HOME/spark_code/Main.py


On the Spark Web UI it shows 40 cores (20 × 2 = 40) as expected, but the YARN UI shows 21 containers using only 21 vCores, i.e. each container is using 1 vCore when it should be using 2. Does anyone have an idea what the actual problem is, and why the YARN containers are not getting more than 1 vCore?
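(Editorial note, not confirmed for this particular cluster: a common cause of this exact symptom is that YARN's CapacityScheduler uses DefaultResourceCalculator, which allocates and reports containers by memory only, so the YARN UI shows 1 vCore per container even though each executor actually runs with the requested number of cores. Switching the scheduler to DominantResourceCalculator makes YARN account for vCores as well, via this property in capacity-scheduler.xml:)

```xml
<!-- capacity-scheduler.xml: make the CapacityScheduler consider both
     memory and vCores when sizing containers (default is memory only). -->
<property>
  <name>yarn.scheduler.capacity.resource-calculator</name>
  <value>org.apache.hadoop.yarn.util.resource.DominantResourceCalculator</value>
</property>
```

After changing this setting, the ResourceManager must be restarted for it to take effect.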



(Screenshots attached: 107828-1555055491267.png, 107765-1555055404744.png)
