I need some guidelines on how to define/design a dynamic Spark job, and which parameters to use. Any suggestions would be appreciated. Thanks in advance.
Thanks for using the Cloudera Community. Based on your post, I assume the following would help: using Dynamic Allocation, which lets Spark adjust resources based on workload requirements.
 Job Scheduling - Spark 3.3.1 Documentation (apache.org)
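To make the suggestion above concrete, here is a minimal sketch of the standard Spark dynamic allocation properties, passed via `spark-submit`. The executor counts and timeouts are example values only; tune them to your cluster. Note that on YARN, dynamic allocation traditionally requires the external shuffle service, while `spark.dynamicAllocation.shuffleTracking.enabled` is the Spark 3.x alternative that avoids it.

```
spark-submit \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.minExecutors=1 \
  --conf spark.dynamicAllocation.initialExecutors=2 \
  --conf spark.dynamicAllocation.maxExecutors=20 \
  --conf spark.dynamicAllocation.executorIdleTimeout=60s \
  --conf spark.dynamicAllocation.schedulerBacklogTimeout=1s \
  --conf spark.shuffle.service.enabled=true \
  your_app.py
```

With these settings, Spark starts with 2 executors, requests more (up to 20) when tasks back up for longer than the backlog timeout, and releases executors that sit idle for more than 60 seconds, so a job's footprint tracks its actual workload.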
Hope you are doing well. We wish to follow up on this post and confirm whether your team was requesting information on Dynamic Allocation, which lets Spark adjust resources based on workload requirements.
Since we haven't heard back from you on this post, we are marking it as solved. If you have any further questions, feel free to update the post and we will get back to you accordingly.