User X runs a %livy.pyspark job in notebook AnalysisX.
Five seconds later, user Y runs a %livy.pyspark job in notebook AnalysisY.
Y has to wait for X's Spark job to finish, which is inefficient.
On HDP 2.5, how can multiple Spark jobs be run from Zeppelin at the same time through Livy with impersonation?
I think multi-user should run fine; however, I suspect a resource-allocation issue here. Zeppelin only supports yarn-client mode for the Spark interpreter, which means the driver runs on the same host as the Zeppelin server. And if you run the Spark interpreter in shared mode, all users share the same SparkContext. You should increase the executor size and executor cores in the interpreter settings.
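For reference, the per-session resource knobs on the Livy interpreter look roughly like the following. The property names come from Zeppelin's Livy interpreter; the host and values are placeholders, not a recommendation, and should be sized so two sessions fit in the YARN queue at once:

```properties
# Zeppelin Livy interpreter settings (illustrative values)
zeppelin.livy.url                      http://livy-host:8998
livy.spark.driver.cores                1
livy.spark.driver.memory               1g
livy.spark.executor.instances          2
livy.spark.executor.cores              2
livy.spark.executor.memory             2g
# Let Spark release idle executors so concurrent sessions can share
# the queue (dynamic allocation requires the YARN shuffle service)
livy.spark.dynamicAllocation.enabled   true
```

If a single session is configured to grab the whole queue, any second session will inevitably wait for it.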
The interpreter settings are as follows:
It is very clear that the Y job stays PENDING, and only when X finishes does Y start.
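A second application sitting in PENDING/ACCEPTED while the first one runs is often a YARN capacity-scheduler limit rather than Zeppelin itself. As a sketch (queue name and values are assumptions; on HDP these live in capacity-scheduler.xml, editable via Ambari):

```properties
# capacity-scheduler.xml (illustrative values)
# Fraction of cluster resources ApplicationMasters may use; if this is
# too low, the second app's AM cannot launch and the app stays pending
yarn.scheduler.capacity.maximum-am-resource-percent=0.5
# Allow a single user to take more than the queue's configured share
yarn.scheduler.capacity.root.default.user-limit-factor=2
yarn.scheduler.capacity.root.default.maximum-capacity=100
```

Checking the ResourceManager UI while Y is pending should show whether the queue or the AM limit is exhausted.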