Duplicate issue at: https://community.cloudera.com/t5/Advanced-Analytics-Apache-Spark/spark-error-after-upgrade-to-cdh5-50/m-p/34367/highlight/false#M1238 Notice my comments there that this is not exclusively at shutdown. We are consistently seeing it at startup too, but Spark usually retries (the default is 3 retries) and re-associates with the Executor actors. This makes me think there is a timing issue, but the error messages suggest the Executor actors are actually shutting down. Looking at the logs for the executors, I see no reason for the shutdown. Perhaps I need to set a particular log level to DEBUG? I have all of org.apache.spark set to DEBUG right now. Maybe it's an Akka log level?
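For reference, a sketch of the log4j.properties settings I'm describing; the akka logger names are an assumption based on the Akka packages Spark used at the time, not something I've confirmed captures the shutdown reason:

```properties
# Already in place: all Spark classes at DEBUG
log4j.logger.org.apache.spark=DEBUG

# Possible addition: Akka actor/remoting logging (logger names assumed)
log4j.logger.akka=DEBUG
log4j.logger.akka.remote=DEBUG
```

If the disassociation really originates in Akka remoting, the akka.remote logger seems like the most likely place for it to show up.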