
How to choose the queue in which you want to submit the job when using Zeppelin?

Contributor

Hi,

I am using Apache Zeppelin on HDP 2.4 and was wondering how I would choose/change the queue to which my jobs are submitted. Currently they go into the default queue, but I would like to dedicate a queue to Zeppelin alone and run all jobs submitted from Zeppelin there. However, I am unable to switch the queue. Can anyone suggest how to do it?

Thanks

1 ACCEPTED SOLUTION

Guru

If you are using Zeppelin for Spark, you can change JAVA_OPTS in zeppelin-env in the Zeppelin configs and add something like

-Dspark.yarn.queue=my_zeppelin_queuename

You can add the MapReduce and Tez queue properties to JAVA_OPTS as well.
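As a sketch, on HDP this would go into zeppelin-env.sh. The variable name ZEPPELIN_JAVA_OPTS and the Tez/MapReduce property names below are assumptions based on standard Hadoop configuration keys, not confirmed in this thread:

```shell
# zeppelin-env.sh -- route Spark, Tez, and MapReduce work submitted from
# Zeppelin to one dedicated queue.
# ZEPPELIN_JAVA_OPTS and the tez.queue.name / mapreduce.job.queuename keys
# are assumed here, not taken from the thread above.
export ZEPPELIN_JAVA_OPTS="-Dspark.yarn.queue=my_zeppelin_queuename \
  -Dtez.queue.name=my_zeppelin_queuename \
  -Dmapreduce.job.queuename=my_zeppelin_queuename"
```

Restart Zeppelin after editing zeppelin-env so the interpreter processes pick up the new options.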


4 REPLIES


Super Guru

@vagrawal Set up a YARN queue:
  • (Optional) You can set up/configure a YARN queue to customize what portion of the cluster the Spark job should use. To do this, follow the two steps below:

    i. Open the YARN Queue Manager view to set up a queue for Spark with the capacities below:

    • Capacity: 50%
    • Max Capacity: 90% (on the sandbox, do not reduce below this or the Spark jobs will not run)

[screenshot: YARN Queue Manager view — screen-shot-2016-05-25-at-92342-pm.png]
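The Queue Manager settings above correspond to properties in capacity-scheduler.xml. A minimal sketch, assuming a queue named `zeppelin` directly under the `root` parent (the queue name and the `default,zeppelin` sibling list are illustrative, not from the thread):

```xml
<!-- capacity-scheduler.xml: an assumed queue named "zeppelin" under root,
     mirroring the 50% capacity / 90% max-capacity settings above -->
<property>
  <name>yarn.scheduler.capacity.root.queues</name>
  <value>default,zeppelin</value>
</property>
<property>
  <name>yarn.scheduler.capacity.root.zeppelin.capacity</name>
  <value>50</value>
</property>
<property>
  <name>yarn.scheduler.capacity.root.zeppelin.maximum-capacity</name>
  <value>90</value>
</property>
```

After changing the scheduler config, refresh the queues (e.g. via the Queue Manager view or a scheduler refresh) so YARN picks up the new queue.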

In that case, if we would like to define livy.spark.yarn.queue, can it point to a different queue, or does it have to be the same as spark.yarn.queue?

Thank you!

Leonid

Master Guru

I was able to set the Livy queue by setting livy.spark.yarn.queue=mylivyqueue in the Livy interpreter in Zeppelin; after restarting the interpreter, Livy notebooks started running on that queue. By the way, my spark.yarn.queue=default.
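Putting the two settings side by side as they would appear in the Zeppelin interpreter properties (queue names are the ones from this thread; the UI path is how it looks in stock Zeppelin):

```
# Zeppelin UI -> Interpreter -> livy -> properties
livy.spark.yarn.queue = mylivyqueue

# Zeppelin UI -> Interpreter -> spark -> properties
spark.yarn.queue = default
```

So the two can point to different queues: notebooks bound to the Livy interpreter land on mylivyqueue, while notebooks using the plain Spark interpreter stay on default.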