Spark job keeps running even after killing the application with yarn kill
Labels: Apache Spark, Apache YARN
Created 10-04-2017 11:34 AM
I am submitting a Spark job using the following shell command:
nohup spark-submit --master yarn-client --py-files libs.zip parser.py --jobname dicomparser >> out.log &
The job started performing the required processing. Partway through, I killed it using
yarn application -kill <application_id>
and it disappeared from the pending applications list, but I can see the processing still going on.
What am I doing wrong?
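A side note on the launch command above: when backgrounding spark-submit with nohup, saving the shell's `$!` at launch makes the local driver process easy to find and kill later. A minimal sketch, reusing the command from the post (the PID-file name `spark_driver.pid` is an illustration, not part of the original post):

```shell
# Background the job as in the post, but record its PID so the driver JVM
# (which stays on this machine in yarn-client mode) can be killed cleanly later.
nohup spark-submit --master yarn-client --py-files libs.zip \
    parser.py --jobname dicomparser >> out.log 2>&1 &
echo $! > spark_driver.pid        # later: kill "$(cat spark_driver.pid)"
```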
Created 10-04-2017 01:03 PM
Hi @Ravindranath Oruganti, you are running the Spark driver in yarn-client mode, i.e. on the machine where you initiated the spark-submit command. You must also kill that driver process on the machine where you ran the command.
Created 10-04-2017 01:41 PM
Hi @Saumil Mayani, thanks for the reply. I tried that: I killed the process named org.apache.spark.deploy.SparkSubmit as well as the YARN application, but the processing is still going on.
Created 10-04-2017 02:20 PM
Could you share which process is still running (driver, ApplicationMaster, or executors) and where it is running (locally, or on the YARN NodeManager servers)?
Created 10-04-2017 06:31 PM
I am not sure which process is running, but my files (which are processed as part of the job) are still being processed after killing the job.
Created 10-04-2017 08:21 PM
Hi @Ravindranath Oruganti, could you run the following on all servers and check:
ps -ef | grep -i parser.py
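As a convenience, the check above can be written so that grep does not match its own command line; the pkill follow-up is an assumption that the procps tools are available on your nodes:

```shell
# The [p] bracket trick stops grep from matching its own command line.
ps -ef | grep -i '[p]arser.py'
# If stray processes show up, they can be killed by command-line pattern:
# pkill -f parser.py
```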
Created 10-05-2017 05:32 AM
Thanks a lot @Saumil Mayani. A parser.py process was indeed still running; after killing those processes, the job stopped. Thanks again.
