How many Spark executors run for the below configuration, and how can I tune it?

Hi, 

 

I am using a Spark standalone cluster, and below are my spark-env.sh properties.

 

export SPARK_EXECUTOR_INSTANCES=432
export SPARK_EXECUTOR_CORES=24
export SPARK_EXECUTOR_MEMORY=36G
export SPARK_DRIVER_MEMORY=24G
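
For comparison, this is how I understand the per-worker resource settings for standalone mode from the docs (the values below are just placeholders, not my actual settings), so I'm not sure whether I should be using these instead:

# Standalone worker settings in spark-env.sh (placeholder values)
export SPARK_WORKER_CORES=24      # cores this worker offers to executors on the node
export SPARK_WORKER_MEMORY=40g    # total memory this worker can allocate to executors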


I have 6 worker nodes. When I run a job that reads very large files and performs joins, it gets stuck and fails. I can see only 6 executors for the job, each with 24 GB.

Could you please point me to any links or details that explain how to tune this and how the worker node and executor concepts relate? I went through a Cloudera blog post, but it focuses mostly on YARN, and I need this for a Spark standalone cluster.
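
For example, would something like this be the right way to size executors per application on a standalone cluster? (The master URL, file name, and numbers here are only placeholders, not my real values.)

# Placeholder spark-submit example for a standalone master
spark-submit \
  --master spark://master-host:7077 \
  --executor-memory 36g \
  --executor-cores 24 \
  --total-executor-cores 144 \
  --driver-memory 24g \
  my_job.py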
