
Spark Job Submission || Determining the Number of Cores and Executors

I have a 10-node cluster (2 name nodes, 1 edge node, 7 worker nodes). Each worker node has 128 GB of memory and 16 cores. I have a 2 TB CSV file that I need to read with Spark in a single job. What is the best way to determine the number of cores and executors? And if I get a memory exception, how do I resolve it?
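The cluster described above can be sized with a widely used rule of thumb (a heuristic, not an official Spark formula): reserve roughly one core and a little memory on each node for the OS and Hadoop daemons, cap each executor at about 5 cores for good HDFS throughput, and leave one executor slot for the YARN ApplicationMaster. A minimal sketch of that arithmetic, assuming the 7-worker / 16-core / 128 GB figures from the question:

```python
# Rule-of-thumb executor sizing for the cluster in the question.
# This is a common community heuristic, not an official Spark formula.

NODES = 7              # worker nodes (name/edge nodes don't run executors)
CORES_PER_NODE = 16
MEM_PER_NODE_GB = 128

# Reserve 1 core and 1 GB per node for the OS / Hadoop daemons.
usable_cores = CORES_PER_NODE - 1        # 15
usable_mem_gb = MEM_PER_NODE_GB - 1      # 127

# ~5 cores per executor is a frequently cited sweet spot for HDFS I/O.
EXECUTOR_CORES = 5
executors_per_node = usable_cores // EXECUTOR_CORES    # 3
# Leave one executor slot for the YARN ApplicationMaster.
total_executors = NODES * executors_per_node - 1       # 20

# Split each node's usable memory across its executors, then shave
# ~7% off for spark.executor.memoryOverhead (off-heap overhead).
mem_per_executor_gb = usable_mem_gb / executors_per_node   # ~42.3
executor_memory_gb = int(mem_per_executor_gb * 0.93)       # 39

print(f"--num-executors {total_executors} "
      f"--executor-cores {EXECUTOR_CORES} "
      f"--executor-memory {executor_memory_gb}G")
# → --num-executors 20 --executor-cores 5 --executor-memory 39G
```

With these numbers a 2 TB CSV read is split across 20 executors x 5 cores = 100 concurrent tasks. On an OutOfMemoryError, the usual first moves are to increase partition count (so each task handles less data) rather than only raising executor memory, and to bump `spark.executor.memoryOverhead` if YARN is killing containers for exceeding memory limits.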