
Issue running Spark application in YARN cluster mode

New Contributor

The code does not have any jar files. I have provided the Python folders as zip archives and am using the following command to run the code:

spark2-submit \
  --queue abc \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 5 \
  --executor-cores 5 \
  --executor-memory 20G \
  --driver-memory 5g \
  --conf spark.yarn.executor.memoryOverhead=4096 \
  --conf spark.sql.shuffle.partitions=400 \
  --conf spark.driver.maxResultSize=0 \
  --conf spark.scheduler.mode=FAIR \
  --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
  --conf spark.kryoserializer.buffer.max=512m \
  --conf spark.executor.heartbeatInterval=100 \
  --conf spark.sql.autoBroadcastJoinThreshold=-1 \
  --conf spark.sql.broadcastTimeout=-1 \
  --py-files /abc/python/dependencies.zip,/abc/python/modules.zip \
  /abc/python/main.py

 

Following is the error:

Exit code: 13
Shell output: main : command provided 1
main : run as user is ***
main : requested yarn user is ***
Getting exit code file...
Creating script paths...
Writing pid file...
Writing to tmp file /
Writing to cgroup task files...
Creating local dirs...
Launching container...


[2022-06-21 07:29:57.254]Container exited with a non-zero exit code 13. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
Last 4096 bytes of stderr :
22/06/21 07:29:53 INFO util.SignalUtils: Registered signal handler for TERM
22/06/21 07:29:53 INFO util.SignalUtils: Registered signal handler for HUP
22/06/21 07:29:53 INFO util.SignalUtils: Registered signal handler for INT
22/06/21 07:29:54 WARN spark.SparkConf: The configuration key 'spark.yarn.executor.memoryOverhead' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.executor.memoryOverhead' instead.
22/06/21 07:29:54 INFO spark.SecurityManager: Changing view acls to: ****
22/06/21 07:29:54 INFO spark.SecurityManager: Changing modify acls to: ***
22/06/21 07:29:54 INFO spark.SecurityManager: Changing view acls groups to:
22/06/21 07:29:54 INFO spark.SecurityManager: Changing modify acls groups to:
22/06/21 07:29:54 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls enabled; users with view permissions: Set(***, *); groups with view permissions: Set(); users with modify permissions: Set(***); groups with modify permissions: Set()
22/06/21 07:29:54 INFO yarn.ApplicationMaster: ApplicationAttemptId: appattempt_1653193227336_217585_000002
22/06/21 07:29:54 INFO yarn.ApplicationMaster: Starting the user application in a separate Thread
22/06/21 07:29:54 INFO yarn.ApplicationMaster: Waiting for spark context initialization...
22/06/21 07:29:54 WARN spark.SparkConf: The configuration key 'spark.yarn.executor.memoryOverhead' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.executor.memoryOverhead' instead.
22/06/21 07:29:55 ERROR yarn.ApplicationMaster: User application exited with status 1
22/06/21 07:29:55 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 13, (reason: User application exited with status 1)
22/06/21 07:29:55 ERROR yarn.ApplicationMaster: Uncaught exception:
org.apache.spark.SparkException: Exception thrown in awaitResult:
at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:226)
at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:448)
at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:276)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:821)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:820)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)

1 REPLY

Master Collaborator

Hi @shraddha 

 

Could you please check whether you have set the master to local while creating the SparkSession in your code.

 

Use the following sample code so that the application can run both locally and on the cluster without changing the master value:

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

val appName = "MySparkApp"

// Create the SparkConf object; setIfMissing keeps the master
// passed in by spark-submit and only falls back to local[2]
val sparkConf = new SparkConf().setAppName(appName).setIfMissing("spark.master", "local[2]")

// Create the SparkSession object
val spark: SparkSession = SparkSession.builder().config(sparkConf).getOrCreate()
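Since your application (main.py) is PySpark rather than Scala, here is a minimal equivalent sketch in Python; the app name "MySparkApp" is just a placeholder:

from pyspark import SparkConf
from pyspark.sql import SparkSession

# Placeholder application name
app_name = "MySparkApp"

# setIfMissing keeps the master that spark2-submit passes in
# (yarn, in cluster mode) and only falls back to local[2] when
# no master has been set, e.g. when running the script directly
conf = SparkConf().setAppName(app_name).setIfMissing("spark.master", "local[2]")

spark = SparkSession.builder.config(conf=conf).getOrCreate()

With this pattern there is no hard-coded .master("local[*]") call anywhere in the code, so the same script works under spark2-submit --deploy-mode cluster without modification.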

 

Review the full logs once more to check whether there are any other errors.
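For example, yarn logs -applicationId application_1653193227336_217585 (the application ID from the ApplicationAttemptId in your log) should retrieve the full aggregated container logs for this run.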