16/02/16 17:55:25 INFO Client: Application report for application_1455610402042_0021 (state: ACCEPTED)
16/02/16 17:55:26 INFO Client: Application report for application_1455610402042_0021 (state: ACCEPTED)
16/02/16 17:55:27 INFO Client: Application report for application_1455610402042_0021 (state: ACCEPTED)
16/02/16 17:55:28 INFO Client: Application report for application_1455610402042_0021 (state: FAILED)
16/02/16 17:55:28 INFO Client:
     client token: N/A
     diagnostics: Application application_1455610402042_0021 failed 2 times due to AM Container for appattempt_1455610402042_0021_000002 exited with exitCode: 10
For more detailed output, check application tracking page:http://w-namenode1.iwi.unisg.ch:8088/cluster/app/application_1455610402042_0021 Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_e12_1455610402042_0021_02_000001
Exit code: 10
Stack trace: ExitCodeException exitCode=10:
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:576)
    at org.apache.hadoop.util.Shell.run(Shell.java:487)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:753)
    at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:212)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

Container exited with a non-zero exit code 10
Failing this attempt. Failing the application.
     ApplicationMaster host: N/A
     ApplicationMaster RPC port: -1
     queue: default
     start time: 1455641459929
     final status: FAILED
     tracking URL: http://w-namenode1.iwi.unisg.ch:8088/cluster/app/application_1455610402042_0021
     user: spark
16/02/16 17:55:28 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:125)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:65)
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:523)
    at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:29)
    at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:685)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/02/16 17:55:29 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
16/02/16 17:55:29 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
16/02/16 17:55:29 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
16/02/16 17:55:29 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
16/02/16 17:55:29 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
16/02/16 17:55:29 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
16/02/16 17:55:29 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
16/02/16 17:55:29 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
16/02/16 17:55:29 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
16/02/16 17:55:29 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
16/02/16 17:55:29 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
16/02/16 17:55:29 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
16/02/16 17:55:29 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
16/02/16 17:55:29 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
16/02/16 17:55:29 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
16/02/16 17:55:29 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
16/02/16 17:55:29 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
16/02/16 17:55:29 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
16/02/16 17:55:29 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
16/02/16 17:55:29 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
16/02/16 17:55:29 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
16/02/16 17:55:29 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
16/02/16 17:55:29 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
16/02/16 17:55:29 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
16/02/16 17:55:29 INFO SparkUI: Stopped Spark web UI at http://10.0.104.219:4040
16/02/16 17:55:29 INFO DAGScheduler: Stopping DAGScheduler
16/02/16 17:55:29 INFO YarnClientSchedulerBackend: Shutting down all executors
16/02/16 17:55:29 INFO YarnClientSchedulerBackend: Asking each executor to shut down
16/02/16 17:55:29 INFO YarnExtensionServices: Stopping org.apache.spark.scheduler.cluster.YarnExtensionServices@6fbb4061
16/02/16 17:55:29 INFO YarnHistoryService: Stopping dequeue service, final queue size is 0
16/02/16 17:55:29 INFO YarnClientSchedulerBackend: Stopped
16/02/16 17:55:29 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/02/16 17:55:29 ERROR Utils: Uncaught exception in thread main
java.lang.NullPointerException
    at org.apache.spark.network.netty.NettyBlockTransferService.close(NettyBlockTransferService.scala:152)
    at org.apache.spark.storage.BlockManager.stop(BlockManager.scala:1228)
    at org.apache.spark.SparkEnv.stop(SparkEnv.scala:100)
    at org.apache.spark.SparkContext$$anonfun$stop$12.apply$mcV$sp(SparkContext.scala:1749)
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1185)
    at org.apache.spark.SparkContext.stop(SparkContext.scala:1748)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:593)
    at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:29)
    at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:685)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/02/16 17:55:29 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:125)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:65)
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:523)
    at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:29)
    at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:685)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/02/16 17:55:29 INFO DiskBlockManager: Shutdown hook called
16/02/16 17:55:29 INFO ShutdownHookManager: Shutdown hook called
16/02/16 17:55:29 INFO ShutdownHookManager: Deleting directory /tmp/spark-14bb1157-525a-41b7-8be5-36ed5ef6e8b8/userFiles-74da93c0-0167-4a41-b12f-7d1ab99de28a
16/02/16 17:55:29 INFO ShutdownHookManager: Deleting directory /tmp/spark-14bb1157-525a-41b7-8be5-36ed5ef6e8b8