Support Questions

Find answers, ask questions, and share your expertise

exitCode: 11, (reason: Max number of executor failures (24) reached)


 Hello All,

 

We are running a Spark application, and it frequently fails. In the log I see the message below.

 

exitCode: 11, (reason: Max number of executor failures (24) reached)

 

And the executors are failing with the error below.

 

Exit status: 1. Diagnostics: Exception from container-launch.
Container id: container_e14_15320282824
Exit code: 1
Stack trace: ExitCodeException exitCode=1: 
	at org.apache.hadoop.util.Shell.runCommand(Shell.java:601)
	at org.apache.hadoop.util.Shell.run(Shell.java:504)
	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:786)
	at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:213)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)

 

Is there any limit on the number of executor failures for a Spark application? I have specified the number of executors as 12. I don't see such a parameter in Cloudera Manager, though. Please suggest.

As per my understanding, the executors are failing due to insufficient memory, and once the failure count reaches the maximum limit, the application is killed. We need to increase executor memory in this case. Kindly help.
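[Editor's note, not from the original thread: on YARN, Spark caps executor failures via `spark.yarn.max.executor.failures`, which defaults to twice the requested executor count (minimum 3), so 12 executors yields the limit of 24 seen in the log. A sketch of overriding that cap and raising executor memory with spark-submit; the class name, jar, and memory sizes below are placeholders:]

```shell
# Hypothetical spark-submit invocation; the class, jar, and sizes are
# placeholders. Raising spark.yarn.max.executor.failures lifts the cap
# (default: 2 * num-executors, minimum 3), but if executors are dying
# from memory pressure, the real fix is usually more executor memory.
spark-submit \
  --master yarn \
  --num-executors 12 \
  --executor-memory 8g \
  --conf spark.yarn.max.executor.failures=48 \
  --class com.example.MyApp \
  my-app.jar
```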

 

Thanks,

Priya

1 REPLY


Did you find the solution for this?