Created 09-07-2017 04:18 AM
Hi everyone.
We have an HDP cluster with several Spark2 clients. I want to connect remotely, and the manual says I need to set SPARK_MASTER_HOST.
Is there a way to make this configuration through the Ambari UI, or do I have to edit spark-env.sh directly? If the latter, on which node should I edit it (namenode, datanode, etc.)?
Created on 09-07-2017 07:15 PM - edited 08-18-2019 02:44 AM
spark-env.sh is an Ambari-managed file and is exposed in the UI, so you can make changes to it from the Ambari UI.
To do so, go to Spark2 -> Configs -> Advanced spark2-env section -> content property and add the environment variable.
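As a sketch, the addition to the "content" property of Advanced spark2-env might look like the lines below. The hostname is an assumption here; use the FQDN of the node you intend to run the standalone master on.

```shell
# Appended to the spark-env.sh template ("content" property) in Ambari.
# master-node.example.com is a placeholder -- substitute your master node's FQDN.
export SPARK_MASTER_HOST=master-node.example.com
# 7077 is the default standalone master port; set it explicitly if you want to be sure.
export SPARK_MASTER_PORT=7077
```

After saving, Ambari will prompt you to restart the affected Spark2 components so the regenerated spark-env.sh is picked up.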
Created 09-08-2017 03:42 AM
I set SPARK_MASTER_HOST from the UI as you said, and spark-env.sh was updated with the line I added, but nothing is listening on port 7077. What am I missing? I also forgot to mention that we have ZooKeeper, and Spark 1 alongside Spark2, if that makes a difference.
Created 09-25-2017 07:11 AM
My mistake: I forgot to run
start-master.sh
after configuring spark-env.sh.
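For anyone hitting the same symptom, a minimal sketch of the missing step, assuming $SPARK_HOME points at your Spark2 installation on the node chosen as master:

```shell
# Start the standalone master daemon; it reads SPARK_MASTER_HOST/SPARK_MASTER_PORT
# from the spark-env.sh generated by Ambari.
$SPARK_HOME/sbin/start-master.sh

# Verify something is now listening on the standalone master port (7077 by default).
ss -tlnp | grep 7077
```

Note that Ambari does not start a standalone master for you; on an HDP cluster Spark jobs normally run on YARN, so a standalone master only exists if you start it yourself.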
Created 07-30-2019 12:31 AM
Hi,
I cannot edit my Spark config files. Am I missing something?