
Set spark.app.name differently for each notebook in isolated mode

Contributor

Hello,

I run the Spark interpreter in isolated mode. For all notebooks, spark.app.name is set to zeppelin, so in the YARN RM Web UI the application name is zeppelin for every started notebook. Is there a way to set a different spark.app.name for each notebook?

6 Replies

Expert Contributor

I believe Zeppelin only supports setting spark.app.name per interpreter at the moment.

As a workaround, you can try duplicating the default Spark interpreter and giving each newly created interpreter a unique spark.app.name.

Super Collaborator

All interpreter instances of the same type share a single interpreter setting. So, as @Tibor Kiss mentioned, the workaround for now is to duplicate the default Spark interpreter and give each newly created interpreter a unique spark.app.name.
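For example, after duplicating the interpreter (say, as spark_notebook1; the interpreter name and app name below are only placeholders), you would override this property on the new interpreter's settings page and then bind that interpreter to the notebook:

spark.app.name    MyFirstNotebook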

New Contributor

Sort of. Notebooks themselves will run under the name configured in Zeppelin, since they are part of the Zeppelin process. Within a notebook, though, if you are running Spark code, you can create a context that runs under a name you configure, as in the code below. When that code is executed in Zeppelin, it creates a new job in the Resource Manager with the desired name, separate from the one running as zeppelin. Hope this helps!

%spark
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

// Build a configuration carrying the desired application name,
// then start a context that registers in YARN under that name.
val conf = new SparkConf()
conf.set("spark.app.name", "DesiredNameHere")
val sc = new SparkContext(conf)
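One caveat: by default Spark allows only one running SparkContext per JVM, so the paragraph above may fail while the interpreter's own sc is still alive. A minimal guard, assuming Zeppelin's predefined sc:

%spark
// Stop the interpreter-provided context before constructing the renamed one;
// by default only one SparkContext may be running per JVM.
sc.stop()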

Contributor

This solution doesn't work.

In the Spark History Server I see the new name, but not in the YARN Web UI.

Contributor

That's probably because your own Spark session completed, which is why you can see it in the History Server.

The default Spark session from Zeppelin is a long-lived session that will keep running until you kill it.

Contributor

Remember that when you do:

val sc = new SparkContext()

you are opening a new context/session instead of altering the old one; a running context's configuration is immutable.
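A minimal %spark sketch of that point (the variable name and app name are placeholders):

%spark
import org.apache.spark.{SparkConf, SparkContext}

// The name of a running context is fixed at construction time;
// its conf still shows the original value, e.g. "zeppelin".
println(sc.getConf.get("spark.app.name"))

// Only a freshly constructed context picks up a new name, and by default
// at most one context may run per JVM, so stop the old one first.
sc.stop()
val renamed = new SparkContext(new SparkConf().setAppName("MyNotebookName"))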