
I am facing an AM container failure when launching a Python or JAR application on Spark in YARN cluster mode. The spark-submit output is below:

17/05/02 17:09:01 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/05/02 17:09:02 INFO client.RMProxy: Connecting to ResourceManager at /127.0.0.1:8032
17/05/02 17:09:02 INFO yarn.Client: Requesting a new application from cluster with 1 NodeManagers
17/05/02 17:09:02 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container)
17/05/02 17:09:02 INFO yarn.Client: Will allocate AM container, with 1408 MB memory including 384 MB overhead
17/05/02 17:09:02 INFO yarn.Client: Setting up container launch context for our AM
17/05/02 17:09:02 INFO yarn.Client: Setting up the launch environment for our AM container
17/05/02 17:09:02 WARN yarn.Client:
SPARK_JAVA_OPTS was detected (set to '-Dspark.driver.port=53411').
This is deprecated in Spark 1.0+.

Please instead use:
 - ./spark-submit with conf/spark-defaults.conf to set defaults for an application
 - ./spark-submit with --driver-java-options to set -X options for a driver
 - spark.executor.extraJavaOptions to set -X options for executors
          
17/05/02 17:09:02 INFO yarn.Client: Preparing resources for our AM container
17/05/02 17:09:02 INFO yarn.Client: Source and destination file systems are the same. Not copying hdfs://master:9000/spark-archive.zip
17/05/02 17:09:02 INFO yarn.Client: Uploading resource file:/home/cdi/Desktop/myJars/wordcount.jar -> hdfs://master:9000/user/cdi/.sparkStaging/application_1493721679584_0004/wordcount.jar
17/05/02 17:09:02 INFO yarn.Client: Uploading resource file:/home/cdi/.ivy2/jars/com.databricks_spark-csv_2.10-1.4.0.jar -> hdfs://master:9000/user/cdi/.sparkStaging/application_1493721679584_0004/com.databricks_spark-csv_2.10-1.4.0.jar
17/05/02 17:09:02 INFO yarn.Client: Uploading resource file:/home/cdi/.ivy2/jars/org.apache.commons_commons-csv-1.1.jar -> hdfs://master:9000/user/cdi/.sparkStaging/application_1493721679584_0004/org.apache.commons_commons-csv-1.1.jar
17/05/02 17:09:03 INFO yarn.Client: Uploading resource file:/home/cdi/.ivy2/jars/com.univocity_univocity-parsers-1.5.1.jar -> hdfs://master:9000/user/cdi/.sparkStaging/application_1493721679584_0004/com.univocity_univocity-parsers-1.5.1.jar
17/05/02 17:09:03 INFO yarn.Client: Uploading resource file:/tmp/spark-4b834889-0538-481a-a62a-3c81a6034dcd/__spark_conf__2878579982192173950.zip -> hdfs://master:9000/user/cdi/.sparkStaging/application_1493721679584_0004/__spark_conf__.zip
17/05/02 17:09:03 INFO spark.SecurityManager: Changing view acls to: cdi
17/05/02 17:09:03 INFO spark.SecurityManager: Changing modify acls to: cdi
17/05/02 17:09:03 INFO spark.SecurityManager: Changing view acls groups to:
17/05/02 17:09:03 INFO spark.SecurityManager: Changing modify acls groups to:
17/05/02 17:09:03 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(cdi); groups with view permissions: Set(); users  with modify permissions: Set(cdi); groups with modify permissions: Set()
17/05/02 17:09:03 INFO yarn.Client: Submitting application application_1493721679584_0004 to ResourceManager
17/05/02 17:09:03 INFO impl.YarnClientImpl: Submitted application application_1493721679584_0004
17/05/02 17:09:04 INFO yarn.Client: Application report for application_1493721679584_0004 (state: ACCEPTED)
17/05/02 17:09:04 INFO yarn.Client:
     client token: N/A
     diagnostics: [Tue May 02 17:09:03 +0530 2017] Application is Activated, waiting for resources to be assigned for AM.  Details : AM Partition = <DEFAULT_PARTITION> ; Partition Resource = <memory:8192, vCores:8> ; Queue's Absolute capacity = 100.0 % ; Queue's Absolute used capacity = 0.0 % ; Queue's Absolute max capacity = 100.0 % ;
     ApplicationMaster host: N/A
     ApplicationMaster RPC port: -1
     queue: default
     start time: 1493725143407
     final status: UNDEFINED
     tracking URL: http://mayur-pc:8088/proxy/application_1493721679584_0004/
     user: cdi
17/05/02 17:09:05 INFO yarn.Client: Application report for application_1493721679584_0004 (state: FAILED)
17/05/02 17:09:05 INFO yarn.Client:
     client token: N/A
     diagnostics: Application application_1493721679584_0004 failed 2 times due to AM Container for appattempt_1493721679584_0004_000002 exited with  exitCode: 1
Failing this attempt.Diagnostics: Exception from container-launch.
Container id: container_1493721679584_0004_02_000001
Exit code: 1
Stack trace: ExitCodeException exitCode=1:
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:972)
    at org.apache.hadoop.util.Shell.run(Shell.java:869)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:1170)
    at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:236)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:305)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:84)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)


Container exited with a non-zero exit code 1
For more detailed output, check the application tracking page: http://mayur-pc:8088/cluster/app/application_1493721679584_0004 Then click on links to logs of each attempt.
. Failing the application.
     ApplicationMaster host: N/A
     ApplicationMaster RPC port: -1
     queue: default
     start time: 1493725143407
     final status: FAILED
     tracking URL: http://mayur-pc:8088/cluster/app/application_1493721679584_0004
     user: cdi
17/05/02 17:09:05 INFO yarn.Client: Deleting staging directory hdfs://master:9000/user/cdi/.sparkStaging/application_1493721679584_0004
Exception in thread "main" org.apache.spark.SparkException: Application application_1493721679584_0004 finished with failed status
    at org.apache.spark.deploy.yarn.Client.run(Client.scala:1132)
    at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1178)
    at org.apache.spark.deploy.yarn.Client.main(Client.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/05/02 17:09:05 INFO util.ShutdownHookManager: Shutdown hook called
17/05/02 17:09:05 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-4b834889-0538-481a-a62a-3c81a6034dcd
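
The stack trace above only reports exit code 1; the actual error (missing class, bad Python path, OOM, etc.) is in the AM container's own stdout/stderr, not in the client output. A sketch of how to pull those logs, assuming log aggregation is enabled and using the application ID from the output above (the local fallback path is an assumption controlled by yarn.nodemanager.log-dirs):

```shell
# Fetch the aggregated container logs for the failed application:
yarn logs -applicationId application_1493721679584_0004

# If log aggregation is disabled, look in the NodeManager's local log
# directory instead (path is an assumption; check yarn.nodemanager.log-dirs):
ls $HADOOP_HOME/logs/userlogs/application_1493721679584_0004/
```

The stderr file of container_1493721679584_0004_02_000001 is the one to read first, since that is the container the diagnostics name.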
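
The WARN about SPARK_JAVA_OPTS is also worth acting on: forcing -Dspark.driver.port through an environment variable is deprecated, and in yarn-cluster mode the driver runs inside the AM, so a pinned client-side port can conflict with the AM launch. A minimal migration sketch following the warning's own suggestions (the JAR name is taken from the upload lines above; the executor GC flag is purely illustrative):

```shell
# Drop the deprecated variable before submitting:
unset SPARK_JAVA_OPTS

# Pass JVM options the supported way instead:
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --driver-java-options "-Dspark.driver.port=53411" \
  --conf spark.executor.extraJavaOptions="-XX:+UseG1GC" \
  /home/cdi/Desktop/myJars/wordcount.jar
```

Equivalently, defaults can live in conf/spark-defaults.conf (e.g. a `spark.driver.extraJavaOptions` line) so every submission picks them up. In cluster mode it is usually safest to let YARN choose the driver port rather than fixing it at all.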