
Spark job keeps on running even after killing application using yarn kill


I am submitting a Spark job using a shell script, as follows:

nohup spark-submit --master yarn-client --py-files libs.zip parser.py --jobname dicomparser >> out.log &

The job started performing the required processing; partway through, I killed it with

yarn application -kill <application_id>

and the job disappeared from the pending list, but I can see the processing is still going on.

What am I doing wrong?

1 ACCEPTED SOLUTION

Super Collaborator

Hi @Ravindranath Oruganti, you are running the Spark driver in yarn-client mode, i.e. on the machine where you initiated the spark-submit command. You must also kill that driver process on the machine where you ran the command.
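
For example, a rough sketch of finding and stopping the local driver (assuming the job was launched with the spark-submit command shown above; <pid> is a placeholder for the process id you find):

 # on the machine where spark-submit was run, find the driver process
 ps -ef | grep -i org.apache.spark.deploy.SparkSubmit
 # kill the driver using the PID from the output above
 kill <pid>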


6 REPLIES

Super Collaborator

Hi @Ravindranath Oruganti, you are running the Spark driver in yarn-client mode, i.e. on the machine where you initiated the spark-submit command. You must also kill that driver process on the machine where you ran the command.


Hi @Saumil Mayani, thanks for the reply. I tried that: I killed the process named org.apache.spark.deploy.SparkSubmit as well as the YARN application, but the processing is still going on.

Super Collaborator

Could you share which process is still running (driver, AM, or executors) and where it is running (locally, or on the YARN NodeManager servers)?
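
For instance, to check the YARN side first (a sketch; the application id is the same one you passed to yarn application -kill):

 # confirm the application no longer shows as running in YARN
 yarn application -list -appStates RUNNING
 # check the final state of the application you killed
 yarn application -status <application_id>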


I am not sure which process is running, but my files (which are processed as part of the job) are still being processed after killing the job.

Super Collaborator

Hi @Ravindranath Oruganti, could you run the following on all servers and check?

 ps -ef | grep -i parser.py
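
If any matching processes show up, something like the following should stop them (a sketch; pkill -f matches against the full command line, so make sure the pattern only matches the processes you intend to kill):

 # kill every process whose command line contains parser.py
 pkill -f parser.py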


Thanks a lot @Saumil Mayani. A process was still running for parser.py; after killing those processes, it stopped. Thanks again.