
Stopping Spark through Ambari

Rising Star

Hi,

Just wondering: what happens when the Spark service is selected in Ambari and then stopped? Which services are stopped behind the scenes? To my understanding, when Spark is installed through Ambari, it installs the Spark client, the Thrift Server and the History Server. When I stop Spark through Ambari, what action is invoked? Is it only the Spark client that is stopped, or all three?

Alternatively, if I have to stop the Spark client through the CLI, how does that need to be done?

Please correct me if my understanding is wrong on any of the above.

Thanks


13 REPLIES

Master Mentor

@Greenhorn Techie it stops all Spark-related services via Ambari. I wouldn't recommend stopping an Ambari-managed service via the shell.
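For reference, the same stop can also be issued against Ambari's REST API rather than the UI, which keeps the change Ambari-managed. A minimal sketch, assuming an Ambari server at ambari-host:8080, a cluster named mycluster, and admin credentials (all placeholders):

# Stop the Spark service via the Ambari REST API (equivalent to clicking Stop in the UI).
# Setting the desired state to INSTALLED tells Ambari to stop every component of the service.
curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Stop Spark"},"Body":{"ServiceInfo":{"state":"INSTALLED"}}}' \
  http://ambari-host:8080/api/v1/clusters/mycluster/services/SPARK

# Start it again by setting the desired state back to STARTED.
curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Start Spark"},"Body":{"ServiceInfo":{"state":"STARTED"}}}' \
  http://ambari-host:8080/api/v1/clusters/mycluster/services/SPARK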

Rising Star

@Artem Ervits Thanks for the quick response. I have two Spark clients at the moment, i.e. one is Ambari-managed and the other was set up manually outside Ambari. So, if I have to stop the manually managed one, how do I do it?

Thanks

Master Mentor

@Greenhorn Techie

This is what it looks like

When you stop Spark through Ambari, it will stop all of the following components.

[screenshot: 1804-screen-shot-2016-02-04-at-112144-am.png]

If you just want to start or stop a particular component, you can click that component and take the action from there.

Click Spark Thrift Server and you can start or stop just that component.

I clicked the Thrift Server and I can start it from there...

[screenshot: 1805-screen-shot-2016-02-04-at-112512-am.png]

Rising Star

@Neeraj Sabharwal, thanks. Please see my follow-up query below.

Master Mentor

@Greenhorn Techie

Please see this, and you can use that landing page for all your future questions on manual install/use.

Now, going back to your requirement: please take a few minutes to read http://spark.apache.org/docs/latest/spark-standalone.html. The relevant scripts are listed below, with a short usage sketch after the list.

  • sbin/start-all.sh - Starts both a master and a number of slaves as described above.
  • sbin/stop-master.sh - Stops the master that was started via the sbin/start-master.sh script.
  • sbin/stop-slaves.sh - Stops all slave instances on the machines specified in the conf/slaves file.
  • sbin/stop-all.sh - Stops both the master and the slaves as described above.
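For a manually installed standalone cluster, a minimal sketch of using those scripts from the CLI, assuming SPARK_HOME points at the manual install on the master node (not the Ambari-managed one):

# Run on the standalone master node; SPARK_HOME is the manual Spark install (assumption).
cd $SPARK_HOME

# Stop all worker (slave) instances listed in conf/slaves, then stop the master.
sbin/stop-slaves.sh
sbin/stop-master.sh

# Or stop the master and all workers in one shot.
sbin/stop-all.sh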

Rising Star

@Neeraj Sabharwal I believe these are applicable to Spark standalone mode; correct me if I'm wrong. However, I wanted to understand how it works in YARN mode. Specifically, my question is the following:

@Artem mentioned that 'it' would stop immediately after app execution from a shell. My question is: what is 'it' here? Is it the Spark client, the Spark driver, or the app itself? Alternatively, what does Ambari show as running that can be stopped through the UI? I presume it's the spark-client. If so, what needs to be done to stop the spark-client from the CLI, similar to what is done through Ambari?

Master Mentor

@Greenhorn Techie the Spark client is just a client, so there is no start or stop.

As you can see from my screenshot, when you stop Spark, it will stop all of those components.

Now, for example, when you run the following, it will launch a Spark job and shut the job down once it finishes.

# Run on a YARN cluster (for client mode, use --master yarn-client instead)
export HADOOP_CONF_DIR=XXX
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master yarn-cluster \
  --executor-memory 20G \
  --num-executors 50 \
  /path/to/examples.jar \
  1000

Rising Star

@Neeraj Sabharwal Thanks. I agree with all the responses given so far. Here is my understanding, to summarise:

Only the spark-thrift-server and spark-history-server can be stopped, either through Ambari or through the CLI (see the sketch after this summary).

The spark-client can only be installed (which puts the required libraries in the specified machine's directories) or uninstalled; there is nothing like stop or start. The same is the case through Ambari.

When using spark-shell or spark-submit, it runs interactively / submits the job to the cluster, and once the application completes, the Spark driver program ends.
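To illustrate the first point, a minimal CLI sketch for stopping and starting those two daemons; the /usr/hdp/current/spark-client path and the spark service user are assumptions and may differ on your install:

# Run on the node hosting the daemons, as the Spark service user (assumption).
su - spark
cd /usr/hdp/current/spark-client

# Stop (or start) the Spark Thrift Server.
sbin/stop-thriftserver.sh
# sbin/start-thriftserver.sh

# Stop (or start) the Spark History Server.
sbin/stop-history-server.sh
# sbin/start-history-server.sh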

Master Mentor

@Greenhorn Techie Thanks for being so specific ....Perfect!! 🙂