Support Questions

Spark job keeps running even after killing the application with yarn kill

New Contributor

I am submitting a Spark job with a shell script as follows:

nohup spark-submit --master yarn-client --py-files libs.zip parser.py --jobname dicomparser >> out.log &

The job started performing the required processing, and partway through I killed it with

yarn application -kill <application_id>

and the job disappeared from the YARN application list, but I can see the processing is still going on.

What am I doing wrong?

1 ACCEPTED SOLUTION

Expert Contributor

Hi @Ravindranath Oruganti, you are running the Spark driver in yarn-client mode, i.e. on the machine where you initiated the spark-submit command. You must also kill that driver process on the machine where you issued the command.
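
For example, a minimal sketch of the cleanup on the submitting machine, assuming the job was launched with the nohup command from the question (the <application_id> and PID placeholders are left for you to fill in):

 # on the machine where spark-submit was run: find the client-side driver process
 ps -ef | grep -i org.apache.spark.deploy.SparkSubmit | grep -v grep

 # kill the driver using the PID shown in the ps output
 kill <pid_of_SparkSubmit>

 # then (or beforehand) release the YARN side as you already did
 yarn application -kill <application_id>

As a side note, if the job is submitted in yarn-cluster mode instead, the driver runs inside YARN, so yarn application -kill alone should be enough.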


6 REPLIES

New Contributor

Hi @Saumil Mayani, thanks for the reply. I tried that: I killed the process named org.apache.spark.deploy.SparkSubmit and the YARN application as well. The processing is still going on.

Expert Contributor

Could you share which process is still running (the driver, the ApplicationMaster, or the executors) and where (the local machine or the YARN NodeManager hosts)?
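
For reference, a rough way to check, assuming the standard Spark-on-YARN process names (SparkSubmit for the yarn-client driver, ExecutorLauncher for the ApplicationMaster in yarn-client mode, CoarseGrainedExecutorBackend for executors):

 # on the machine where spark-submit was run: the yarn-client driver
 ps -ef | grep -i SparkSubmit | grep -v grep

 # on each NodeManager host: ApplicationMaster and executor containers
 ps -ef | grep -iE "ExecutorLauncher|CoarseGrainedExecutorBackend" | grep -v grep

 # what YARN itself still reports for the application
 yarn application -status <application_id>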

New Contributor

I am not sure which process is running, but my files (which are processed as part of the job) are still being processed after killing the job.

Expert Contributor

Hi @Ravindranath Oruganti, could you run the following on all servers and check:

 ps -ef | grep -i parser.py
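
Once you see which hosts still have a matching process, something like the following should stop them (assuming pkill is available; double-check the ps output first so you do not match an unrelated process):

 # terminate any leftover processes whose command line mentions parser.py
 pkill -f parser.py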

New Contributor

Thanks a lot @Saumil Mayani. Processes were running for parser.py; after killing them, the processing stopped. Thanks again.