Support Questions


How many Spark executors run for the below configuration, and how can I tune it?


I am using a Spark standalone cluster, and below are my spark-env properties.

export SPARK_EXECUTOR_INSTANCES=432

export SPARK_EXECUTOR_CORES=24

export SPARK_EXECUTOR_MEMORY=36G

export SPARK_DRIVER_MEMORY=24G

I have 6 worker nodes. When I run a job that involves very large files and joins, it gets stuck and fails. I can see 6 executors for the job with 24 GB. Could you please point me to any links or details to help me tune this and understand the concepts of worker nodes and executors? I referred to a Cloudera blog, but that is mostly about YARN; I need this for a Spark standalone cluster.

1 ACCEPTED SOLUTION


Hi @Srinivasarao Daruna, HDP does not support Spark in standalone mode. You need to use Spark on YARN.

When running Spark in YARN cluster mode, you can specify the number of executors with the parameter:

--num-executors 6

This will give you 6 executors.
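
As a rough sketch (the application class and jar names below are illustrative placeholders, not from this thread), a full spark-submit invocation in YARN cluster mode might look like:

# Sketch only: com.example.MyJob and my-job.jar are placeholders for your application.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 6 \
  --executor-cores 4 \
  --executor-memory 24G \
  --driver-memory 8G \
  --class com.example.MyJob \
  my-job.jar

Note that YARN must have enough vcores and memory available per NodeManager to satisfy each executor's request; otherwise you will be granted fewer executors than you asked for.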

For additional information on running Spark on YARN, please see http://spark.apache.org/docs/latest/running-on-yar...

Cheers,

Andrew
