Is it possible to modify the existing scheduling in Spark?
By default, Spark uses FIFO scheduling, which is also shown in the Web UI. Is there a way to change the scheduling mode, for example by writing our own scheduling algorithm and deploying it in Spark, so that when running a long sequence of jobs, Spark schedules them onto the worker nodes according to a user-specified policy?
Is there a way to modify the existing scheduling in Spark?
Try going through the job scheduling guide: http://spark.apache.org/docs/latest/job-scheduling.html. Within a single application, Spark supports two scheduling modes, FIFO (the default) and FAIR, selected via the `spark.scheduler.mode` configuration property; Spark does not expose a pluggable interface for dropping in an entirely custom scheduler.
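If what you need is fair sharing between concurrent jobs rather than a fully custom algorithm, the built-in FAIR scheduler may already be enough. A minimal sketch (the pool name, weights, and file path below are illustrative, not required values): set `spark.scheduler.mode FAIR` and point `spark.scheduler.allocation.file` at a pool-definition file in `spark-defaults.conf` or via `--conf` on `spark-submit`, then define pools like this:

```xml
<!-- fairscheduler.xml: pool name, weight, and minShare are example values -->
<allocations>
  <pool name="production">
    <!-- Scheduling mode *within* this pool: FIFO or FAIR -->
    <schedulingMode>FAIR</schedulingMode>
    <!-- Relative share of the cluster compared to other pools -->
    <weight>2</weight>
    <!-- Minimum number of CPU cores this pool is guaranteed -->
    <minShare>1</minShare>
  </pool>
</allocations>
```

A job can then be routed to a pool at runtime by setting a thread-local property on the SparkContext before submitting it, e.g. `sc.setLocalProperty("spark.scheduler.pool", "production")` in Scala. Anything beyond weighted fair sharing between pools would require modifying Spark's own scheduler code.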
Writing a new job scheduler is a non-trivial task.
What are the specific requirements for which you need to write a custom scheduler?