06-13-2018 02:45 AM
Can you please share the settings you used for Hive on Spark? I have set the following in spark-defaults.conf for running Hive-on-Spark, but the resources are not released after the job completes:

spark.master -> yarn-client
spark.eventLog.enabled -> true
spark.dynamicAllocation.enabled -> true
spark.dynamicAllocation.minExecutors -> 1
spark.dynamicAllocation.initialExecutors -> 1
spark.dynamicAllocation.executorIdleTimeout -> 10
spark.dynamicAllocation.schedulerBacklogTimeout -> 1

The Hive-on-Spark job runs successfully, but the job is not removed by the application master after it completes. Thanks.
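For reference, this is roughly how my spark-defaults.conf looks in file form. The last line (spark.shuffle.service.enabled) is my own assumption rather than something from the settings above, since dynamic allocation on YARN normally needs the external shuffle service before idle executors can be removed; please correct me if that is not relevant here.

# spark-defaults.conf (sketch of the settings listed above; timeouts are in seconds)
spark.master                                      yarn-client
spark.eventLog.enabled                            true
spark.dynamicAllocation.enabled                   true
spark.dynamicAllocation.minExecutors              1
spark.dynamicAllocation.initialExecutors          1
spark.dynamicAllocation.executorIdleTimeout       10
spark.dynamicAllocation.schedulerBacklogTimeout   1
# assumed addition (not in my current config): external shuffle service,
# which dynamic allocation on YARN typically requires to release idle executors
spark.shuffle.service.enabled                     true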