
Spark Job Submission || Determining the Number of Cores and Executors

Hi,

 

I have a 10-node cluster (2 name nodes, 1 edge node, 7 worker nodes). Each worker node has 128 GB of memory and 16 cores. I have a 2 TB CSV file that I need to read with Spark in a single job. What is the best way to determine the number of cores and executors? And if I get a memory exception, how do I resolve it?
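For reference, here is a rough sketch of how a cluster of this size is often translated into spark-submit settings, assuming the common rule of thumb of about 5 cores per executor and reserving 1 core and a few GB per node for the OS and Hadoop daemons. The file name and exact memory figures below are placeholders; the right values depend on your YARN configuration and workload.

# 7 workers x 16 cores = 112 cores; reserve 1 core per node for OS/daemons -> 105 usable cores
# 105 cores / 5 cores per executor = 21 executors; reserve 1 slot for the YARN ApplicationMaster -> 20
# ~120 GB usable per node / 3 executors per node = ~40 GB; leave ~10% for YARN memory overhead -> ~35 GB each
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 20 \
  --executor-cores 5 \
  --executor-memory 35g \
  --driver-memory 8g \
  your_job.py

The 5-cores-per-executor figure is a widely cited guideline for keeping HDFS I/O throughput reasonable per executor, not a hard rule; it is worth benchmarking against your own job.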
