
How to configure a dynamic Spark job

Contributor

Hi All,

 

I need some guidance on how to define and design a dynamic Spark job and which parameters to set. Any suggestions would be much appreciated.

 

Regards,

Pankaj

1 ACCEPTED SOLUTION

Super Collaborator

Hello @pankshiv1809 

 

Thanks for using the Cloudera Community. Based on your post, [1] should help, i.e. using Dynamic Allocation to let Spark adjust resources based on workload requirements.

 

Regards, Smarak

 

[1] Job Scheduling - Spark 3.3.1 Documentation (apache.org)
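
For reference, here is a minimal PySpark sketch (not part of the original reply) showing how Dynamic Allocation can be enabled through Spark configuration. The application name and executor bounds are illustrative placeholders to tune for your own cluster, and the same properties can also be set via spark-submit --conf or spark-defaults.conf.

from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("dynamic-allocation-demo")  # illustrative application name
    # Let Spark grow and shrink the executor count with the workload.
    .config("spark.dynamicAllocation.enabled", "true")
    .config("spark.dynamicAllocation.minExecutors", "1")            # lower bound
    .config("spark.dynamicAllocation.maxExecutors", "10")           # upper bound
    .config("spark.dynamicAllocation.initialExecutors", "2")        # starting count
    .config("spark.dynamicAllocation.executorIdleTimeout", "60s")   # release idle executors
    # Shuffle data must survive executor removal: enable shuffle tracking
    # (Spark 3.x) or, on YARN, use the external shuffle service instead
    # (spark.shuffle.service.enabled=true on the NodeManagers).
    .config("spark.dynamicAllocation.shuffleTracking.enabled", "true")
    .getOrCreate()
)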




Super Collaborator

Hello @pankshiv1809 

 

Hope you are doing well. We wish to follow up on the post and confirm whether your team was requesting information about Dynamic Allocation, which lets Spark adjust resources based on workload requirements.

 

Regards, Smarak

 

[1] Job Scheduling - Spark 3.3.1 Documentation (apache.org)

Super Collaborator

Hello @pankshiv1809 

 

Since we haven't heard back from you concerning the post, we are marking it as solved. If you have any further questions, feel free to update the post and we will get back to you accordingly.

 

Regards, Smarak