Support Questions
Find answers, ask questions, and share your expertise

Issue submitting a Spark job using the REST API

New Contributor



I'm facing an issue with Spark. I'm working with CDH 5.7.1 and I would like to submit a Spark job to the Resource Manager using the POST method. So first of all I submit a request for a new application to


http://<DNS name of resource manager>:8088/ws/v1/cluster/apps/new-application?


then I use the returned application id to submit the Spark job to


http://<DNS name of resource manager>:8088/ws/v1/cluster/apps?


with a POST request whose JSON body contains all the details of my job.

When I submit the request using the REST API I define this environment variable:




CLASSPATH=where I define all the paths to the hadoop, hdfs, yarn and mapreduce libs (/usr/lib/hadoop-hdfs/* and /usr/lib/hadoop-hdfs/lib/* as well)
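For reference, the two-step flow described above can be sketched with curl. Everything below is a placeholder sketch: the Resource Manager host is invented, the application id is reused from the log further down, and the JSON shows only a small subset of the fields the YARN application-submission API accepts (the ApplicationMaster launch command in particular has to be assembled by hand, which is exactly the error-prone part). The script only builds the payload instead of contacting a cluster.

```shell
# Placeholder Resource Manager address and application id; a real id
# comes from the new-application response in step 1.
RM="http://resourcemanager.example.com:8088"
APP_ID="application_1480503607885_0004"

# Step 1 (against a live cluster): ask the RM for a new application id.
#   curl -X POST "$RM/ws/v1/cluster/apps/new-application"
# The response body contains "application-id" and the cluster's
# "maximum-resource-capability".

# Step 2: build a minimal submission body and POST it to /ws/v1/cluster/apps.
cat > /tmp/submit.json <<EOF
{
  "application-id": "$APP_ID",
  "application-name": "wordcount",
  "application-type": "SPARK",
  "am-container-spec": {
    "commands": {
      "command": "<full ApplicationMaster launch command goes here>"
    }
  },
  "resource": { "memory": 7920, "vCores": 5 }
}
EOF
#   curl -X POST -H "Content-Type: application/json" \
#        -d @/tmp/submit.json "$RM/ws/v1/cluster/apps"
echo "wrote /tmp/submit.json for $APP_ID"
```

The environment variables and classpath set by hand here would go into the `environment` element of `am-container-spec`, which is the part spark-submit normally assembles for you.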


The job starts to run, but after a little while it fails with the error message


16/11/30 12:31:48 WARN cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Container marked as failed: container_1480503607885_0004_01_000120 on host: dev-hdp-01.prometeia. Exit status: 1. Diagnostics: Exception from container-launch.
Container id: container_1480503607885_0004_01_000120
Exit code: 1
Stack trace: ExitCodeException exitCode=1: 
	at org.apache.hadoop.util.Shell.runCommand(
	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(
	at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(
	at java.util.concurrent.ThreadPoolExecutor.runWorker(
	at java.util.concurrent.ThreadPoolExecutor$

Container exited with a non-zero exit code 1


I'm submitting a simple word-count job, and if I try to submit the same job from the shell using spark-submit with the command


 spark-submit --class test.SparkWordCount --master yarn /home/ermas/ClouderaTest-0.0.1-SNAPSHOT.jar /ermas/test.log 2


everything goes fine.



I'm working with a cluster of 4 nodes with 276 GB of memory and 256 cores in total, and my request is for 40 containers with 5 cores and 7920 MB of memory each.



Expert Contributor

It's not recommended to use the YARN REST APIs; they are still in the early stages. The recommended way to submit Spark jobs is spark-submit, as you have already tried. spark-submit sets up everything necessary, including the appropriate configurations, classpath, and jar files. Spark handles local resources a bit differently, so manually setting the classpath may be problematic.


If you are looking for a REST service to submit Spark jobs, you may want to look into the Livy project. Livy accepts REST requests to start a Spark job and manages the Spark context.
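If you do go the Livy route, a batch submission looks roughly like the following. The Livy host is a placeholder (8998 is Livy's default port), the jar path and class are taken from the spark-submit command in the question, and this sketch only prints the request it would send rather than contacting a server.

```shell
LIVY="http://livy.example.com:8998"                 # placeholder Livy server
JAR="/home/ermas/ClouderaTest-0.0.1-SNAPSHOT.jar"   # jar from the question

# Livy's batch API: POST /batches with a JSON body naming the jar, the
# main class, and its arguments; Livy then runs spark-submit on your
# behalf and tracks the resulting YARN application.
PAYLOAD="{\"file\": \"$JAR\", \"className\": \"test.SparkWordCount\", \"args\": [\"/ermas/test.log\", \"2\"]}"

# Against a live server:
#   curl -X POST -H "Content-Type: application/json" \
#        -d "$PAYLOAD" "$LIVY/batches"
echo "$PAYLOAD"
```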


If you are still looking to resolve this problem, you will also need to look at the logs from the worker container. The log you currently have is from the driver / application master and doesn't show why the container exited.
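Assuming log aggregation is enabled on the cluster, those worker-container logs can be pulled with the YARN CLI. This sketch only prints the command; run it on a cluster gateway node.

```shell
# The application id is embedded in the failed container's name:
# container_1480503607885_0004_01_000120 -> application_1480503607885_0004
APP_ID="application_1480503607885_0004"

# Dumps stdout/stderr of every container of the application, including
# the one that exited with code 1:
CMD="yarn logs -applicationId $APP_ID"
echo "$CMD"
```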



New Contributor

When you say "early stages", what exactly do you mean by that?

I have tried using the YARN REST API to submit a Spark job, but I am facing many problems.


java.lang.IllegalArgumentException: Can not create a Path from a null string

I know I am missing some file but don't know where.

Could you please point me to any documentation on doing this?


Thanks in advance