
How to kill Spark jobs running in Zeppelin

Explorer

I have a couple of jobs still running in Zeppelin after several days. Cancelling the jobs in Zeppelin has no effect. I've stopped all services, restarted Ambari, and even rebooted the servers, but the jobs won't go away.

Is there a way to kill these jobs?

Thanks!

1 REPLY

Super Collaborator

Hi @Scott McDowell,

When you start an interpreter, it launches a YARN application in the cluster.

There are a couple of ways to clear these sessions:

1. Terminating from YARN (most effective)

List the YARN applications, find the application ID of the one launched from Zeppelin, and kill it.

You can use the YARN UI: Running Applications --> select the application --> Kill Application (top-right corner of the application details page).

If the cluster is Kerberized and uses YARN ACLs, obtain a ticket first, then list and kill the application from the command line:

kinit <user>

yarn application -list

yarn application -kill <appId>
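
If many applications are running, you can narrow the output down. As a rough sketch (this assumes the Zeppelin interpreter's YARN application name contains "Zeppelin", which depends on your interpreter settings):

yarn application -list | grep -i zeppelin

The application ID is the first column of the matching row; pass it to yarn application -kill.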

2. Killing the interpreter process (from the terminal)

You can look up the PIDs of the interpreter processes on the Zeppelin server and kill them, which ensures the ApplicationMaster gets aborted (though this might not be graceful). A sketch of this follows.
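
A minimal sketch, assuming a default Zeppelin install where each interpreter runs as a separate JVM on the Zeppelin server (process names and paths vary by distribution, so the grep pattern here is an assumption):

# find Zeppelin interpreter processes (the pattern is an assumption)
ps -ef | grep -i zeppelin | grep -i interpreter | grep -v grep

# kill the interpreter by its PID (second column of ps -ef output)
kill <pid>

# if it ignores SIGTERM, force it (not graceful)
kill -9 <pid>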

Hope this helps!