Support Questions

Set a different Spark application name for each notebook in isolated mode


I run the Spark interpreter in isolated mode. For all notebooks the application name is set to "zeppelin", so in the YARN RM Web UI the application name is "zeppelin" for every started notebook. Is there a way to set a different application name for each notebook?



I believe Zeppelin only supports setting the application name per interpreter at the moment.

As a workaround, you can try duplicating the default Spark interpreter and giving a unique name to each newly created interpreter.

Expert Contributor

All interpreter instances of the same type share a single interpreter setting. So, as @Tibor Kiss mentioned, for now the workaround is to duplicate the default Spark interpreter and give a unique name to each newly created interpreter.
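For the duplicated interpreter, the application name would be set through Spark's standard `spark.app.name` property in the interpreter's properties table (the interpreter name `spark_notebook1` below is just an illustrative placeholder):

```
# Zeppelin: Interpreter menu -> spark_notebook1 (duplicated interpreter) -> edit -> Properties
# spark.app.name is the standard Spark property controlling the name shown in the YARN RM Web UI
spark.app.name    notebook1
```

Each notebook is then bound to its own duplicated interpreter, so each one shows up in YARN under its own name.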

New Contributor

Sort of. Notebooks themselves are going to run under the name configured in Zeppelin, as they are a Zeppelin process. Within a notebook, though, if you are running Spark code, you can create a SparkContext instance which will run under a configured name. This is done via the code below. When this code is executed in Zeppelin, it will create a new application in the resource manager with the desired name, separate from the one running as "zeppelin". Hope this helps!

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

// "myNotebookApp" is a placeholder; use the name you want shown in YARN
val conf = new SparkConf().setAppName("myNotebookApp")
val sc = new SparkContext(conf)

This solution doesn't work.

In the Spark History Server I see the new name, but not in the YARN Web UI.


That's probably because your own Spark session completed, which is why you can only see it in the History Server.

The default Spark session from Zeppelin is a long-lived session which will run until you kill it.


Remember, when you do:

val sc = new SparkContext()

you are opening a new context/session instead of altering the old one; a SparkContext's configuration is immutable once the context is created.
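A sketch of what this means in practice, assuming a Zeppelin Scala paragraph where `sc` is Zeppelin's existing context (the application name "perNotebookApp" is a placeholder):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Spark allows only one active SparkContext per JVM by default,
// so Zeppelin's long-lived context must be stopped first.
sc.stop()

// Build a fresh, immutable configuration with the desired name.
val conf = new SparkConf().setAppName("perNotebookApp")
val newSc = new SparkContext(conf)
```

Note the trade-off: the YARN application for this context carries the new name only while the context is alive; once it stops, the application moves to the History Server, which matches the behaviour described above.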
