Posted 07-30-2020 10:33 PM
PS D:\Spark\spark-2.3.0-bin-hadoop2.7\examples\jars> spark-submit --master yarn --deploy-mode client --class org.apache.spark.examples.SparkPi spark-examples_2.11-2.3.0.jar 1000
2020-07-31 12:59:09 INFO SparkContext:54 - Running Spark version 2.3.0
2020-07-31 12:59:09 INFO SparkContext:54 - Submitted application: Spark Pi
2020-07-31 12:59:09 INFO SecurityManager:54 - Changing view acls to: dgbtds
2020-07-31 12:59:09 INFO SecurityManager:54 - Changing modify acls to: dgbtds
2020-07-31 12:59:09 INFO SecurityManager:54 - Changing view acls groups to:
2020-07-31 12:59:09 INFO SecurityManager:54 - Changing modify acls groups to:
2020-07-31 12:59:09 INFO SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(dgbtds); groups with view permissions: Set(); users with modify permissions: Set(dgbtds); groups with modify permissions: Set()
2020-07-31 12:59:09 INFO Utils:54 - Successfully started service 'sparkDriver' on port 62303.
2020-07-31 12:59:09 INFO SparkEnv:54 - Registering MapOutputTracker
2020-07-31 12:59:09 INFO SparkEnv:54 - Registering BlockManagerMaster
2020-07-31 12:59:09 INFO BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2020-07-31 12:59:09 INFO BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2020-07-31 12:59:09 INFO DiskBlockManager:54 - Created local directory at C:\Users\dgbtds\AppData\Local\Temp\blockmgr-50a0ec9b-c9c1-4a6a-b9a2-a880d621e169
2020-07-31 12:59:09 INFO MemoryStore:54 - MemoryStore started with capacity 366.3 MB
2020-07-31 12:59:10 INFO SparkEnv:54 - Registering OutputCommitCoordinator
2020-07-31 12:59:10 INFO log:192 - Logging initialized @2397ms
2020-07-31 12:59:10 INFO Server:346 - jetty-9.3.z-SNAPSHOT
2020-07-31 12:59:10 INFO Server:414 - Started @2462ms
2020-07-31 12:59:10 INFO AbstractConnector:278 - Started ServerConnector@ec2bf82{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2020-07-31 12:59:10 INFO Utils:54 - Successfully started service 'SparkUI' on port 4040.
2020-07-31 12:59:10 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@69c43e48{/jobs,null,AVAILABLE,@Spark}
2020-07-31 12:59:10 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@733037{/jobs/json,null,AVAILABLE,@Spark}
2020-07-31 12:59:10 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7728643a{/jobs/job,null,AVAILABLE,@Spark}
2020-07-31 12:59:10 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5167268{/jobs/job/json,null,AVAILABLE,@Spark}
2020-07-31 12:59:10 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1cfd1875{/stages,null,AVAILABLE,@Spark}
2020-07-31 12:59:10 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@28c0b664{/stages/json,null,AVAILABLE,@Spark}
2020-07-31 12:59:10 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2c444798{/stages/stage,null,AVAILABLE,@Spark}
2020-07-31 12:59:10 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@436390f4{/stages/stage/json,null,AVAILABLE,@Spark}
2020-07-31 12:59:10 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4d157787{/stages/pool,null,AVAILABLE,@Spark}
2020-07-31 12:59:10 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@68ed96ca{/stages/pool/json,null,AVAILABLE,@Spark}
2020-07-31 12:59:10 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6d1310f6{/storage,null,AVAILABLE,@Spark}
2020-07-31 12:59:10 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3228d990{/storage/json,null,AVAILABLE,@Spark}
2020-07-31 12:59:10 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@54e7391d{/storage/rdd,null,AVAILABLE,@Spark}
2020-07-31 12:59:10 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@50b8ae8d{/storage/rdd/json,null,AVAILABLE,@Spark}
2020-07-31 12:59:10 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@255990cc{/environment,null,AVAILABLE,@Spark}
2020-07-31 12:59:10 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@51c929ae{/environment/json,null,AVAILABLE,@Spark}
2020-07-31 12:59:10 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3c8bdd5b{/executors,null,AVAILABLE,@Spark}
2020-07-31 12:59:10 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@29d2d081{/executors/json,null,AVAILABLE,@Spark}
2020-07-31 12:59:10 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@40e4ea87{/executors/threadDump,null,AVAILABLE,@Spark}
2020-07-31 12:59:10 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@58783f6c{/executors/threadDump/json,null,AVAILABLE,@Spark}
2020-07-31 12:59:10 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3a7b503d{/static,null,AVAILABLE,@Spark}
2020-07-31 12:59:10 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@10b3df93{/,null,AVAILABLE,@Spark}
2020-07-31 12:59:10 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@ea27e34{/api,null,AVAILABLE,@Spark}
2020-07-31 12:59:10 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1dfd5f51{/jobs/job/kill,null,AVAILABLE,@Spark}
2020-07-31 12:59:10 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3c321bdb{/stages/stage/kill,null,AVAILABLE,@Spark}
2020-07-31 12:59:10 INFO SparkUI:54 - Bound SparkUI to 0.0.0.0, and started at http://192.168.124.2:4040
2020-07-31 12:59:10 INFO SparkContext:54 - Added JAR file:/D:/Spark/spark-2.3.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.3.0.jar at spark://192.168.124.2:62303/jars/spark-examples_2.11-2.3.0.jar with timestamp 1596171550288
2020-07-31 12:59:10 WARN DomainSocketFactory:117 - The short-circuit local reads feature cannot be used because UNIX Domain sockets are not available on Windows.
2020-07-31 12:59:11 INFO TimelineClientImpl:297 - Timeline service address: http://slave1.spark:8188/ws/v1/timeline/
2020-07-31 12:59:11 INFO RMProxy:98 - Connecting to ResourceManager at master.spark/192.168.124.199:8050
2020-07-31 12:59:11 INFO Client:54 - Requesting a new application from cluster with 3 NodeManagers
2020-07-31 12:59:11 INFO Client:54 - Verifying our application has not requested more than the maximum memory capability of the cluster (4864 MB per container)
2020-07-31 12:59:11 INFO Client:54 - Will allocate AM container, with 896 MB memory including 384 MB overhead
2020-07-31 12:59:11 INFO Client:54 - Setting up container launch context for our AM
2020-07-31 12:59:11 INFO Client:54 - Setting up the launch environment for our AM container
2020-07-31 12:59:11 INFO Client:54 - Preparing resources for our AM container
2020-07-31 12:59:12 INFO Client:54 - Source and destination file systems are the same. Not copying hdfs://master.spark:8020/Spark_jars/spark-yarn/spark-yarn_2.11-2.3.0.jar
2020-07-31 12:59:12 INFO Client:54 - Uploading resource file:/C:/Users/dgbtds/AppData/Local/Temp/spark-10fa9f0a-eb88-4901-a1be-bcb6f53dbe76/__spark_conf__4369598958010522009.zip -> hdfs://master.spark:8020/user/dgbtds/.sparkStaging/application_1595604923172_0049/__spark_conf__.zip
2020-07-31 12:59:12 INFO SecurityManager:54 - Changing view acls to: dgbtds
2020-07-31 12:59:12 INFO SecurityManager:54 - Changing modify acls to: dgbtds
2020-07-31 12:59:12 INFO SecurityManager:54 - Changing view acls groups to:
2020-07-31 12:59:12 INFO SecurityManager:54 - Changing modify acls groups to:
2020-07-31 12:59:12 INFO SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(dgbtds); groups with view permissions: Set(); users with modify permissions: Set(dgbtds); groups with modify permissions: Set()
2020-07-31 12:59:12 INFO Client:54 - Submitting application application_1595604923172_0049 to ResourceManager
2020-07-31 12:59:13 INFO YarnClientImpl:273 - Submitted application application_1595604923172_0049
2020-07-31 12:59:13 INFO SchedulerExtensionServices:54 - Starting Yarn extension services with app application_1595604923172_0049 and attemptId None
2020-07-31 12:59:14 INFO Client:54 - Application report for application_1595604923172_0049 (state: FAILED)
2020-07-31 12:59:14 INFO Client:54 -
     client token: N/A
     diagnostics: Application application_1595604923172_0049 failed 2 times due to AM Container for appattempt_1595604923172_0049_000002 exited with exitCode: 1
Failing this attempt.Diagnostics: [2020-07-31 12:59:15.166]Exception from container-launch.
Container id: container_e07_1595604923172_0049_02_000001
Exit code: 1
[2020-07-31 12:59:15.168]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
Last 4096 bytes of stderr :
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:854)
    at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 14 more
[2020-07-31 12:59:15.168]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
Last 4096 bytes of stderr :
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:854)
    at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 14 more
For more detailed output, check the application tracking page: http://master.spark:8088/cluster/app/application_1595604923172_0049 Then click on links to logs of each attempt.
. Failing the application.
     ApplicationMaster host: N/A
     ApplicationMaster RPC port: -1
     queue: default
     start time: 1596171553962
     final status: FAILED
     tracking URL: http://master.spark:8088/cluster/app/application_1595604923172_0049
     user: dgbtds
2020-07-31 12:59:14 INFO Client:54 - Deleted staging directory hdfs://master.spark:8020/user/dgbtds/.sparkStaging/application_1595604923172_0049
2020-07-31 12:59:14 ERROR SparkContext:91 - Error initializing SparkContext.
org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:89)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:63)
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:164)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:500)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2486)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:930)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:921)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:921)
    at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
    at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:879)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2020-07-31 12:59:14 INFO AbstractConnector:318 - Stopped Spark@ec2bf82{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2020-07-31 12:59:14 INFO SparkUI:54 - Stopped Spark web UI at http://192.168.124.2:4040
2020-07-31 12:59:14 WARN YarnSchedulerBackend$YarnSchedulerEndpoint:66 - Attempted to request executors before the AM has registered!
2020-07-31 12:59:14 INFO YarnClientSchedulerBackend:54 - Shutting down all executors
2020-07-31 12:59:14 INFO YarnSchedulerBackend$YarnDriverEndpoint:54 - Asking each executor to shut down
2020-07-31 12:59:14 INFO SchedulerExtensionServices:54 - Stopping SchedulerExtensionServices (serviceOption=None, services=List(), started=false)
2020-07-31 12:59:14 INFO YarnClientSchedulerBackend:54 - Stopped
2020-07-31 12:59:14 INFO MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint stopped!
2020-07-31 12:59:14 INFO MemoryStore:54 - MemoryStore cleared
2020-07-31 12:59:14 INFO BlockManager:54 - BlockManager stopped
2020-07-31 12:59:14 INFO BlockManagerMaster:54 - BlockManagerMaster stopped
2020-07-31 12:59:14 WARN MetricsSystem:66 - Stopping a MetricsSystem that is not running
2020-07-31 12:59:14 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
2020-07-31 12:59:14 INFO SparkContext:54 - Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:89)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:63)
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:164)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:500)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2486)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:930)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:921)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:921)
    at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
    at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:879)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2020-07-31 12:59:14 INFO ShutdownHookManager:54 - Shutdown hook called
2020-07-31 12:59:14 INFO ShutdownHookManager:54 - Deleting directory C:\Users\dgbtds\AppData\Local\Temp\spark-0afb924a-da78-4d5f-b578-98df9ba1413c
2020-07-31 12:59:14 INFO ShutdownHookManager:54 - Deleting directory C:\Users\dgbtds\AppData\Local\Temp\spark-10fa9f0a-eb88-4901-a1be-bcb6f53dbe76
PS D:\Spark\spark-2.3.0-bin-hadoop2.7\examples\jars>
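A guess at the cause, based only on what the log shows: the staging step copies just spark-yarn_2.11-2.3.0.jar from hdfs://master.spark:8020/Spark_jars/spark-yarn, while org.apache.spark.internal.Logging lives in spark-core, so the AM classpath seems to be missing most of the Spark runtime. If spark.yarn.jars was set to only that one jar, a sketch of a possible fix is below (the HDFS path /Spark_jars/spark-jars is a made-up example, not from the log):

```shell
# Stage the FULL contents of $SPARK_HOME/jars on HDFS, not just spark-yarn.
# (/Spark_jars/spark-jars is a hypothetical path; adjust to your cluster.)
hdfs dfs -mkdir -p /Spark_jars/spark-jars
hdfs dfs -put "$SPARK_HOME"/jars/* /Spark_jars/spark-jars/

# Point spark.yarn.jars at the whole directory (glob), then resubmit.
spark-submit --master yarn --deploy-mode client \
  --conf spark.yarn.jars="hdfs://master.spark:8020/Spark_jars/spark-jars/*" \
  --class org.apache.spark.examples.SparkPi \
  spark-examples_2.11-2.3.0.jar 1000
```

Alternatively, leaving spark.yarn.jars unset should also work: Spark then zips and uploads the local jars directory itself on every submit, which is slower but avoids an incomplete hand-maintained jar set.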