
configure spark standalone using ambari


New Contributor

Currently I have Spark on YARN configured and working (on 9 servers, approximately 200 cores).

I would like to configure them in standalone mode, to avoid the time wasted allocating containers.

Is it possible?

4 REPLIES

Re: configure spark standalone using ambari

New Contributor

anyone?

Re: configure spark standalone using ambari

New Contributor

One more time, can someone help?

Re: configure spark standalone using ambari

New Contributor

bump

Re: configure spark standalone using ambari

Mentor

@ilia987 

 

You can set your deployment mode in configuration files or from the command line when submitting a job. Use one of the following options to set the deployment mode:

  • Client deployment mode: the driver runs locally.
  • Cluster deployment mode: the driver runs on the cluster.
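
For the configuration-file route, the same choice can be pinned in spark-defaults.conf. A minimal sketch (standard Spark property names; the values here are examples for a YARN deployment):

# conf/spark-defaults.conf (relative to SPARK_HOME)
spark.master             yarn
spark.submit.deployMode  cluster

Anything set on the spark-submit command line overrides these file defaults.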

Standalone Cluster Mode
As the name suggests, it's a standalone cluster with only Spark-specific components. It has no dependency on Hadoop components; Spark's own Master process acts as the cluster manager.
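
To stand up the standalone daemons themselves, a sketch (paths are relative to SPARK_HOME, and the master host is a placeholder; the worker script is named start-worker.sh in newer Spark releases):

$ ./sbin/start-master.sh                               # starts the standalone Master (default port 7077)
$ ./sbin/start-slave.sh spark://<master-host>:7077     # run on each worker host to register it with the Master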

To launch a Spark application on a standalone cluster in cluster deploy mode (the master host below is a placeholder):

$ ./bin/spark-submit --class path.to.your.Class --master spark://<master-host>:7077 --deploy-mode cluster [options] <app jar> [app options]

 

Client mode

To launch a Spark application in client mode, do the same but replace cluster with client. The following shows how you can run spark-shell in client mode on YARN:

$ ./bin/spark-shell --master yarn --deploy-mode client
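
Against a standalone cluster, the same client-mode shell would point at the Master's spark:// URL instead of YARN (host is a placeholder):

$ ./bin/spark-shell --master spark://<master-host>:7077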

 

Hope that helps.