I run the Spark interpreter in isolated mode. For all notebooks, spark.app.name is set to zeppelin, so in the YARN RM Web UI the application name is zeppelin for every started notebook. Is there a way to set a different spark.app.name for each notebook?
I believe Zeppelin currently only supports setting spark.app.name per interpreter, not per notebook.
As a workaround, you can try duplicating the default Spark interpreter and giving each newly created interpreter a unique spark.app.name.
All interpreter instances of the same type share a single interpreter setting. So, as @Tibor Kiss mentioned, the workaround for now is to duplicate the default Spark interpreter and give each newly created interpreter a unique spark.app.name.
Sort of. Notebooks themselves run under the name configured in Zeppelin, since they are part of the Zeppelin process. Within a notebook, though, if you are running Spark code, you can create a SparkContext instance that runs under a name you choose. This is done via the code below. When that code is executed in Zeppelin, it creates a new application in the Resource Manager with the desired name, separate from the one running as zeppelin. Hope this helps!
%spark
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

val conf = new SparkConf()
conf.set("spark.app.name", "DesiredNameHere")
val sc = new SparkContext(conf)
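As a quick sanity check (a sketch, assuming the context above was created successfully and is bound to `sc`), the standard `SparkContext.appName` and `SparkConf.get` APIs should both report the configured value:

```scala
%spark
// Verify that the custom application name took effect.
// `appName` and `getConf` are standard SparkContext methods.
println(sc.appName)                        // prints the running context's name
println(sc.getConf.get("spark.app.name"))  // same value, read from the conf
```

Both lines should print the name you set (e.g. DesiredNameHere), matching what the YARN RM Web UI shows for the new application.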
That's probably because your own Spark session has completed, so you can now only see it in the Spark History Server.
The default Spark session from Zeppelin is a long-lived session that keeps running until you kill it.