Able to run in spark-cluster but not YARN cluster

I am following the example here:

https://databricks-training.s3.amazonaws.com/movie-recommendation-with-mllib.html

 

Using Cloudera CDH 5.1.2 with Spark 1.0.0, I managed to run it in Spark standalone cluster mode using the following command:

spark-submit --driver-memory 2g --class MovieLensALS --master spark://mysparkmaster:7077 target/scala-2.10/movielens-als-assembly-0.1.jar <parameters...>

 

However, when I tried to run it on YARN in cluster mode using this command line:

spark-submit --driver-memory 2g --class MovieLensALS --deploy-mode cluster --master yarn target/scala-2.10/movielens-als-assembly-0.1.jar <parameters...>

 

All I get is a failed job with the following warning:

WARN YarnClusterScheduler: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory
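
In case it matters, I have not set any executor resources explicitly in the YARN submission above. If that turns out to be the problem, I assume the command would need flags roughly like the following; the executor count and sizes below are placeholder guesses, not values tuned for my cluster:

spark-submit --driver-memory 2g --num-executors 2 --executor-memory 2g --executor-cores 2 --class MovieLensALS --deploy-mode cluster --master yarn target/scala-2.10/movielens-als-assembly-0.1.jar <parameters...>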

 

Any pointers would be greatly appreciated.
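
In case there is more detail in the container logs, I assume I can pull them with something like this after the job fails, provided YARN log aggregation is enabled (the application id below is a placeholder):

yarn application -list
yarn logs -applicationId <applicationId>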
