I am not able to submit a Spark job. The error is: YarnScheduler: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
I submit the application with the following command:

$ spark-submit --class util.Main --master yarn-client --executor-memory 512m --executor-cores 2 my.jar my_config
I installed Apache Ambari version 220.127.116.11 and ResourceManager version 18.104.22.168.5.0.0 on Ubuntu 14.04.
What could be the cause of this issue?
Your cluster has likely already allocated the resources you asked for to other Spark jobs. It looks like you have a backlog of jobs that are already running, or perhaps a job that is stuck in the queue. Kill the previous applications and see if your job runs. You can also kill all the YARN applications and resubmit the jobs:
$ yarn application -list
$ yarn application -kill $application_id
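To kill every queued or running application in one pass, the list output can be piped into the kill command. A minimal sketch, assuming the default output format of `yarn application -list` (where the application ID is the first column of each data row):

```shell
# Kill all YARN applications that are currently RUNNING or ACCEPTED.
# Assumes the default `yarn application -list` output, where each data
# row begins with an ID of the form application_<timestamp>_<seq>.
for app_id in $(yarn application -list -appStates RUNNING,ACCEPTED 2>/dev/null \
                | awk '/^application_/ {print $1}'); do
  echo "Killing $app_id"
  yarn application -kill "$app_id"
done
```

This requires a live ResourceManager, so run it on a node where the `yarn` CLI is configured against your cluster.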
Thanks for the reply. I already executed the two commands above, and YARN lists 0 applications. Then I ran the job again, and I am facing two different situations:
The job hangs on:
INFO Client: Application report for application_1480498999425_0002 (state: ACCEPTED)
The job starts (RUNNING status), but when the first Spark job executes it stops with the following error:
YarnScheduler: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
I've seen this issue quite often when folks are first setting up their cluster. Make sure you have NodeManagers running on all of your data nodes. In addition, check the YARN configuration for the maximum container size (yarn.scheduler.maximum-allocation-mb and yarn.scheduler.maximum-allocation-vcores). If it is less than what you are requesting, resources will never get allocated. Finally, check the default number of executors; try specifying --num-executors 2. If you request more resources than your cluster has (more vcores or RAM), you will get this issue.
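Putting that together, a resubmission with explicitly small resource requests might look like the sketch below. It reuses the command from the question; the 2-executor/1-core/512m values are illustrative and must fit under your scheduler's maximum allocation:

```shell
# First, see how much memory and how many vcores each NodeManager reports.
yarn node -list

# Compare your request against the scheduler limits in yarn-site.xml:
#   yarn.scheduler.maximum-allocation-mb
#   yarn.scheduler.maximum-allocation-vcores

# Resubmit with an explicit, small executor footprint that fits the cluster.
spark-submit \
  --class util.Main \
  --master yarn-client \
  --num-executors 2 \
  --executor-cores 1 \
  --executor-memory 512m \
  my.jar my_config
```

If `yarn node -list` shows fewer nodes than you expect, start the missing NodeManagers before tuning the request any further.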