Created 03-14-2016 05:04 PM
Is it possible to modify the existing scheduling in Spark?
When installed, Spark uses FIFO scheduling by default, which is also shown in the Web UI. Is there a way to change the scheduling mode, for example to write our own scheduling algorithm and deploy it in Spark, so that when running a long sequence of jobs, it schedules the jobs to the slave nodes according to a user-specified policy?
Is there a way to modify the existing scheduling in spark?
Thanks
Sridhar
Created 03-14-2016 05:18 PM
Try going through http://spark.apache.org/docs/latest/job-scheduling.html
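That page covers Spark's built-in FAIR scheduling mode, which may already be close to what you need. A minimal sketch of enabling it (the pool name, weights, and file path below are illustrative, not from this thread):

```
# spark-defaults.conf
spark.scheduler.mode             FAIR
spark.scheduler.allocation.file  /path/to/fairscheduler.xml
```

```xml
<?xml version="1.0"?>
<!-- fairscheduler.xml: example pool definitions (names and values are hypothetical) -->
<allocations>
  <pool name="production">
    <schedulingMode>FAIR</schedulingMode>
    <weight>2</weight>
    <minShare>2</minShare>
  </pool>
</allocations>
```

Jobs submitted from a given thread can then be assigned to a pool with `sc.setLocalProperty("spark.scheduler.pool", "production")`.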
Writing a new job scheduler is a non-trivial task.
What are the specific requirements for which you need to write a custom scheduler?
Created 03-14-2016 05:41 PM
Schedule the jobs to the slave nodes depending on each slave node's processing time.
Created 03-14-2016 06:21 PM
Most requirements are met by Spark's Dynamic Resource Allocation. Have you tried using DRA?
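A minimal configuration sketch for enabling DRA (the executor counts are illustrative; this assumes the external shuffle service is set up on the cluster, which DRA requires):

```
# spark-defaults.conf
spark.dynamicAllocation.enabled        true
spark.shuffle.service.enabled          true
spark.dynamicAllocation.minExecutors   1
spark.dynamicAllocation.maxExecutors   10
```

With this in place, Spark grows and shrinks the number of executors based on the pending task backlog, rather than holding a fixed allocation for the lifetime of the application.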
Created 03-31-2016 07:59 AM
Can you point me to a better tutorial page or site on how to use Spark dynamic resource allocation?
Also, is it possible to write our own scheduling algorithm and execute it?
Created 03-31-2016 11:56 PM