Configure Spark2 with Ambari
- Labels: Apache Ambari, Apache Spark
Created ‎09-07-2017 04:18 AM
Hi everyone.
We have an HDP cluster with several Spark2 clients. I want to allow remote connections, and the manual says I need to set SPARK_MASTER_HOST.
Is there a way to make this configuration through the Ambari UI, or do I have to edit spark-env.sh directly? If the latter, on which node should I edit it (NameNode, DataNode, etc.)?
Created on ‎09-07-2017 07:15 PM - edited ‎08-18-2019 02:44 AM
spark-env.sh is an Ambari-managed file and is exposed in the UI, so you can make changes to it from the Ambari UI.
To do so, go to Spark2 -> Configs -> Advanced spark2-env -> content property and add the environment variable there.
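For example, the lines appended to the content property might look like this (the hostname below is a placeholder, not from this thread; use whichever node you want to act as the standalone master):

```shell
# Appended to the "content" template of Advanced spark2-env in Ambari.
# master1.example.com is a placeholder hostname -- substitute your own node.
export SPARK_MASTER_HOST=master1.example.com
# 7077 is the default standalone master port; set it explicitly if you
# need a different one.
export SPARK_MASTER_PORT=7077
```

After saving, Ambari will prompt you to restart the affected Spark2 components so the updated spark-env.sh is written out to the hosts.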
Created ‎09-08-2017 03:42 AM
I set SPARK_MASTER_HOST from the UI as you said, and spark-env.sh was updated with the line I added, but nothing is listening on port 7077. What am I missing? I forgot to mention that we also have ZooKeeper and Spark 1 alongside Spark2, in case that matters.
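A quick way to confirm whether a standalone master is actually bound on a node is to check for a TCP listener on the port; this is a generic check (not from the thread), assuming `ss` from iproute2 is available:

```shell
# Check for a TCP listener on the Spark standalone master port (default 7077).
PORT=7077
if ss -ltn 2>/dev/null | grep -q ":$PORT "; then
    echo "listening on $PORT"
else
    echo "no listener on $PORT"
fi
```

If nothing is listening, the master process itself is not running; setting the variable in spark-env.sh alone does not start it.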
Created ‎09-25-2017 07:11 AM
My mistake, I forgot to run
start-master.sh
after configuring spark-env.sh.
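For reference, the standalone master is started with the sbin script shipped with Spark; the install path below is an assumption (HDP-style layout) and is not from the thread:

```shell
# Start the Spark standalone master; it binds SPARK_MASTER_HOST and
# SPARK_MASTER_PORT from spark-env.sh. The path assumes an HDP layout.
/usr/hdp/current/spark2-client/sbin/start-master.sh

# Afterwards, the master port (7077 by default) should show a LISTEN entry:
ss -ltn | grep 7077
```

Note that Ambari does not manage a standalone master component; on HDP, Spark normally runs on YARN, so a standalone master has to be started by hand like this.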
Created ‎07-30-2019 12:31 AM
Hi,
I cannot edit my Spark config files. Am I missing something?
