
How to configure a dynamic Spark job

Contributor

Hi All,

 

I need some guidance on how to define and design a dynamic Spark job, and which parameters to use. Any suggestions would be much appreciated. Thanks in advance.

 

Regards,

Pankaj

1 ACCEPTED SOLUTION

Master Collaborator

Hello @pankshiv1809 

 

Thanks for using the Cloudera Community. Based on your post, we assume [1] would help, i.e., using Dynamic Allocation to allow Spark to adjust resources based on workload requirements.

 

Regards, Smarak

 

[1] Job Scheduling - Spark 3.3.1 Documentation (apache.org)
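For reference, a minimal sketch of what enabling Dynamic Allocation can look like on a Spark 3.x `spark-submit` command. The executor counts and the application file name (`your_job.py`) are illustrative placeholders, not recommendations; tune them to your cluster:

```shell
# Enable Dynamic Allocation so Spark scales executors with workload.
# shuffleTracking avoids needing an external shuffle service (Spark 3.0+).
spark-submit \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.shuffleTracking.enabled=true \
  --conf spark.dynamicAllocation.minExecutors=1 \
  --conf spark.dynamicAllocation.initialExecutors=2 \
  --conf spark.dynamicAllocation.maxExecutors=10 \
  your_job.py   # placeholder application
```

The same `spark.dynamicAllocation.*` properties can instead be set in `spark-defaults.conf` or on `SparkConf` at session creation; see [1] for the full list of related settings.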


3 REPLIES

Master Collaborator

Hello @pankshiv1809 

 

Thanks for using the Cloudera Community. Based on your post, we assume [1] would help, i.e., using Dynamic Allocation to allow Spark to adjust resources based on workload requirements.

 

Regards, Smarak

 

[1] Job Scheduling - Spark 3.3.1 Documentation (apache.org)

Master Collaborator

Hello @pankshiv1809 

 

Hope you are doing well. We wish to follow up on the post and confirm whether your team was requesting information on Dynamic Allocation [1], which allows Spark to adjust resources based on workload requirements.

 

Regards, Smarak

 

[1] Job Scheduling - Spark 3.3.1 Documentation (apache.org)

Master Collaborator

Hello @pankshiv1809 

 

Since we haven't heard back from your side concerning the post, we are marking it as solved. If you have any further questions, feel free to update the post and we shall get back to you accordingly.

 

Regards, Smarak