Support Questions
Find answers, ask questions, and share your expertise

Spark Scheduling - Modify existing Scheduling in Spark

Contributor

Is it possible to modify the existing scheduling in Spark?

Out of the box, Spark uses FIFO scheduling, which is also shown in the Web UI. Is there a way to change the scheduling mode, for example by writing our own scheduling algorithm and deploying it in Spark, so that when running a long sequence of jobs, they are assigned to the slave nodes according to a user-specified policy?

Is there a way to modify the existing scheduling in Spark?

Thanks

Sridhar

5 REPLIES 5

Re: Spark Scheduling - Modify existing Scheduling in Spark

Try going through http://spark.apache.org/docs/latest/job-scheduling.html

Writing a new job scheduler is a non-trivial task.

What are the specific requirements for which you need to write a custom scheduler?
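As a starting point before writing anything custom: that page describes the built-in FAIR scheduler, which can be enabled with a configuration change rather than new code. A minimal sketch (the pool names, weights, and file path below are just examples, not required values):

```
# spark-defaults.conf, or passed via --conf to spark-submit
spark.scheduler.mode             FAIR
spark.scheduler.allocation.file  /path/to/fairscheduler.xml
```

```xml
<!-- fairscheduler.xml: example pool definitions -->
<allocations>
  <pool name="production">
    <schedulingMode>FAIR</schedulingMode>
    <weight>2</weight>      <!-- gets twice the share of other pools -->
    <minShare>2</minShare>  <!-- guaranteed minimum number of cores -->
  </pool>
  <pool name="adhoc">
    <schedulingMode>FIFO</schedulingMode>
    <weight>1</weight>
  </pool>
</allocations>
```

Jobs are then assigned to a pool per thread by setting the `spark.scheduler.pool` local property on the SparkContext before submitting them.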


Re: Spark Scheduling - Modify existing Scheduling in Spark

Contributor

Schedule the jobs to the slave nodes depending on each slave node's processing time.


Re: Spark Scheduling - Modify existing Scheduling in Spark

Most requirements are met by Spark's Dynamic Resource Allocation (DRA). Have you tried using DRA?
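For context, DRA lets Spark grow and shrink the number of executors based on the workload, which often removes the need for a custom scheduler. A minimal configuration sketch (the executor counts and timeout are example values to tune for your cluster; the external shuffle service must also be enabled on the node managers):

```
# spark-defaults.conf, or passed via --conf to spark-submit
spark.dynamicAllocation.enabled              true
spark.shuffle.service.enabled                true
spark.dynamicAllocation.minExecutors         1
spark.dynamicAllocation.maxExecutors         10
spark.dynamicAllocation.executorIdleTimeout  60s
```

With this in place, executors are requested when tasks queue up and released after sitting idle, so concurrent jobs share the cluster according to demand.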


Re: Spark Scheduling - Modify existing Scheduling in Spark

Contributor

@vshukla

Can you point me to a good tutorial page or site on how to use Spark dynamic resource allocation?

Also, is it possible to write our own scheduling algorithm and run it?


Re: Spark Scheduling - Modify existing Scheduling in Spark

Mentor