Spark Driver stays Active
Labels: Apache Spark, Apache YARN
Created 02-01-2024 08:17 PM
Hello All,
My Spark job runs fine except for one issue: after all stages have finished and all the executors on YARN are dead, my driver stays active, so the application ID also stays active on YARN, even though it is not processing anything.
All suggestions are appreciated.
Thank you
Created 02-01-2024 11:20 PM
@Kunank_kesar, welcome to our community! To help you get the best possible answer, I have tagged our Spark experts @RangaReddy and @mimran, who may be able to assist you further.
Please feel free to provide any additional information or details about your query, and we hope you will find a satisfactory solution to your question.
Regards,
Vidya Sargur, Community Manager
Was your question answered? Make sure to mark the answer as the accepted solution.
If you find a reply useful, say thanks by clicking on the thumbs up button.
Learn more about the Cloudera Community.
Created 02-04-2024 07:40 AM
Could you please check the following to resolve the issue:
1. Have you closed the Spark session properly? As a best practice, close the Spark session explicitly if it is not already closed (see the sketch after this list).
2. Check the application code by adding some loggers to see what the driver is doing without stopping the application.
3. As a last step, go to the driver machine, collect thread dumps, and check whether the driver is still performing any operation internally.
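As a minimal sketch of points 1 and 2 above, here is a hypothetical PySpark job skeleton; the app name, logger name, column name, and input/output paths are placeholders, not taken from the original post. The key part is the try/finally that always calls spark.stop(), so the driver's SparkContext shuts down once the work is done and the YARN application can finish instead of lingering:

```python
import logging

from pyspark.sql import SparkSession

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("example_job")  # hypothetical logger name


def main():
    spark = (
        SparkSession.builder
        .appName("example-job")          # placeholder application name
        .getOrCreate()
    )
    try:
        log.info("Starting job")
        df = spark.read.parquet("/tmp/input")        # placeholder input path
        (df.groupBy("some_column")                   # placeholder column
           .count()
           .write.mode("overwrite")
           .parquet("/tmp/output"))                  # placeholder output path
        log.info("Job finished, stopping SparkSession")
    finally:
        # Without this, an un-stopped SparkContext or lingering non-daemon
        # threads can keep the driver JVM (and the YARN application) alive
        # even after all executors have exited.
        spark.stop()


if __name__ == "__main__":
    main()
```

For point 3, running the JDK's jstack <driver-pid> on the driver host is one way to capture the thread dump; any non-daemon user threads (for example, custom thread pools that were never shut down) keeping the driver JVM alive will show up there.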
