Created 10-04-2017 11:34 AM
I am submitting a Spark job using a shell script, as follows:
nohup spark-submit --master yarn-client --py-files libs.zip parser.py --jobname dicomparser >> out.log &
The job started performing the required processing. Partway through, I killed it with
yarn application -kill <application_id>
and the job disappeared from the pending list, but I can see the processing still going on.
What am I doing wrong?
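(For reference, one way to confirm whether YARN itself considers the application killed is to query its status; <application_id> below is the placeholder from the kill command above:)
yarn application -status <application_id>    # the report's Final-State should show KILLED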
Created 10-04-2017 01:03 PM
Hi @Ravindranath Oruganti, you are running the Spark driver in yarn-client mode, i.e., on the machine where you initiated the spark-submit command. yarn application -kill only tears down the containers YARN manages (the ApplicationMaster and executors); you must also kill the driver process on the machine where you ran spark-submit.
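As a minimal sketch, assuming the driver is still alive on the submitting machine, you could locate and kill it like this (the PID is whatever the ps output shows):
ps -ef | grep -i org.apache.spark.deploy.SparkSubmit    # find the client-side driver process
kill <pid>       # send SIGTERM to the driver's PID
kill -9 <pid>    # only if it does not exit after SIGTERM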
Created 10-04-2017 01:41 PM
Hi @Saumil Mayani, thanks for the reply. I tried that: I killed the process named org.apache.spark.deploy.SparkSubmit as well as the YARN application. The processing is still going on.
Created 10-04-2017 02:20 PM
Could you share which processes are still running (driver, ApplicationMaster, executors) and where (locally, or on the YARN NodeManager servers)?
Created 10-04-2017 06:31 PM
I am not sure which process is running, but my files (which are processed as part of the job) are still being processed after killing the job.
Created 10-04-2017 08:21 PM
Hi @Ravindranath Oruganti, could you run the following on all servers and check:
ps -ef | grep -i parser.py
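If that turns up stray Python workers, one way to stop them in a single step (a sketch; pkill -f matches against the full command line, so double-check what the grep above returns before running it) is:
pkill -f parser.py    # sends SIGTERM to every process whose command line contains parser.py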
Created 10-05-2017 05:32 AM
Thanks a lot @Saumil Mayani. Processes were indeed still running for parser.py. After killing them, the processing stopped. Thanks again.