Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.
Announcements
This board is archived and read-only for historical reference. To ask a new question, please post a new topic on the appropriate active board.

Spark2 running forever in zeppelin

New Member

This Spark job is running forever, and I cannot seem to stop it.

1) I have restarted the spark2 interpreter within Zeppelin.

2) I have stopped and started Zeppelin from HDP.

3) I have tried stopping it from the jobs page.

In all cases it is unresponsive and still keeps running.

How can I kill this job?


1 ACCEPTED SOLUTION


@Victor Usually a restart of the interpreter also kills the YARN application. Check the RM UI; there should be a YARN application running for the interpreter. If you kill that YARN application, it will probably stop this. The code looks pretty simple; to debug, you should check the spark2 interpreter log and the YARN application log to find out what is happening.


4 REPLIES


New Member

@Felix Albani These are my RM UI and Spark UI screenshots. How can I stop this from running?


@Victor To kill it, you can issue the command:

yarn application -kill <Application ID>
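If the application ID isn't known yet, it can be taken from the first column of `yarn application -list` output. A minimal sketch of extracting it (the sample line, application name, and ID below are made up for illustration; on a real cluster you would pipe the live `yarn application -list` output instead):

```shell
# A sample (made-up) line in the format printed by `yarn application -list`:
sample='application_1528000000000_0042	Zeppelin	SPARK	zeppelin	default	RUNNING'

# The application ID is the first field; extract it with awk:
app_id=$(echo "$sample" | awk '{print $1}')
echo "$app_id"

# It could then be killed with:
#   yarn application -kill "$app_id"
```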

New Member

@Felix Albani To add to this: the code always worked; when I added help(), it started running forever.
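That behavior is consistent with how Python's help() works: called with no arguments, it starts the interactive help console, which blocks waiting for input forever in a non-interactive notebook paragraph. A sketch of a non-blocking alternative using the standard pydoc module (shown here on the built-in len as an example):

```python
import pydoc

# help() with no arguments blocks on stdin; pydoc.render_doc returns the
# same documentation as a string without any interactive prompt.
doc_text = pydoc.render_doc(len, renderer=pydoc.plaintext)
print(doc_text.splitlines()[0])
```

Calling help(some_object) with an explicit argument also avoids the interactive console, since it just prints that object's documentation and returns.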