Created 06-08-2018 11:17 AM
Hi,
I have created an HDP 2.6 cluster on AWS with 1 master node and 4 worker nodes, managed by Ambari. I have configured the spark-env.sh file on the master node, and now I want to apply those settings to all worker nodes in the cluster.
How do I refresh the cluster configuration so that the latest configs are reflected on all nodes in the cluster?
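For context, spark-env.sh typically carries environment-variable settings like the ones below. These entries and paths are illustrative only, not taken from the cluster in question:

```shell
# Illustrative spark-env.sh entries -- actual values depend on your cluster layout.
export SPARK_DAEMON_MEMORY=1024m          # memory for Spark daemons (history server, etc.)
export SPARK_LOG_DIR=/var/log/spark      # where Spark writes its logs
export SPARK_PID_DIR=/var/run/spark      # where Spark keeps daemon PID files
export HADOOP_CONF_DIR=/etc/hadoop/conf  # Hadoop client configs Spark should use
```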
Created 06-08-2018 11:22 AM
If it is an Ambari-managed cluster, then you should make the changes from the Ambari UI:
Ambari UI --> Spark --> Configs --> Advanced --> "Advanced spark2-env"
Ambari UI --> Spark --> Configs --> Advanced --> "Advanced spark-env"
Then, after you save the changes, Ambari will show a "Restart Required" icon for the host components that require a restart, and during the restart of the Spark components it will push the spark-env (spark2-env) changes to the respective hosts.
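If you prefer the command line, the restart that pushes the saved configs can also be triggered through Ambari's REST API. The host name, cluster name, and credentials below are placeholders you would replace with your own; this sketch only builds and prints the curl commands so they can be reviewed before being run against a live cluster:

```shell
#!/bin/sh
# Placeholders -- replace with your Ambari host, credentials, and cluster name.
AMBARI_HOST="ambari.example.com"
CLUSTER="mycluster"
USER="admin"

# 1) Inspect the currently applied config versions for the cluster.
GET_CMD="curl -u ${USER} -H 'X-Requested-By: ambari' http://${AMBARI_HOST}:8080/api/v1/clusters/${CLUSTER}?fields=Clusters/desired_configs"

# 2) Restart the SPARK service components so the saved spark-env changes
#    are pushed out to every host that runs them.
RESTART_CMD="curl -u ${USER} -H 'X-Requested-By: ambari' -X POST -d '{\"RequestInfo\":{\"command\":\"RESTART\",\"context\":\"Restart Spark\"},\"Requests/resource_filters\":[{\"service_name\":\"SPARK\"}]}' http://${AMBARI_HOST}:8080/api/v1/clusters/${CLUSTER}/requests"

# Print the commands for review instead of executing them here.
echo "$GET_CMD"
echo "$RESTART_CMD"
```

Restarting from the Ambari UI achieves the same result; the API route is mainly useful for scripting the refresh across many services.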
Created 06-08-2018 11:48 AM
Thanks