I have created an HDP 2.6 cluster on AWS with 1 master node and 4 worker nodes, managed with Ambari.
I have configured the spark-env.sh file on the master node, and now I want to apply those settings to all worker nodes in the cluster.
How do I refresh the cluster configuration so the latest configs are reflected on all nodes?
Make the changes through the Ambari UI (Spark2 service → Configs) rather than editing spark-env.sh directly on one node. After you save the changes, Ambari will show a "restart required" icon for the host components that need a restart, and when you restart the Spark components it will push the spark-env (spark2-env) changes to the respective hosts.
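If you want to script the restart instead of clicking through the UI, Ambari's REST API accepts a RESTART request for the stale components. Below is a rough sketch: the server URL, cluster name, and worker hostnames are placeholders for your environment, and the actual curl call is left commented out since it needs live admin credentials.

```shell
#!/bin/sh
# Hypothetical values -- replace with your Ambari server, cluster name, and hosts.
AMBARI_URL="http://ambari-server.example.com:8080"
CLUSTER="hdp26"

# Request body asking Ambari to restart the Spark2 client components,
# which pushes the updated spark2-env settings out to those hosts.
PAYLOAD='{
  "RequestInfo": {
    "command": "RESTART",
    "context": "Restart Spark2 components to push spark2-env changes",
    "operation_level": "host_component"
  },
  "Requests/resource_filters": [
    {"service_name": "SPARK2",
     "component_name": "SPARK2_CLIENT",
     "hosts": "worker1,worker2,worker3,worker4"}
  ]
}'

echo "$PAYLOAD"

# Uncomment to send for real (requires an Ambari admin account):
# curl -u admin -H 'X-Requested-By: ambari' -X POST \
#      -d "$PAYLOAD" "$AMBARI_URL/api/v1/clusters/$CLUSTER/requests"
```

You can watch the resulting restart operation in the Ambari UI under "Background Operations", the same as a restart triggered from the Configs page.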