Explorer
Posts: 25
Registered: 01-10-2017

Spark with YARN: how to configure YARN to use all vcores

We are running a Spark Streaming job with YARN as the cluster manager. I have dedicated 7 vcores per node via yarn-site.xml, as shown in the screenshot below.

 

[Screenshot: YarnCore.PNG]
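For reference, dedicating 7 vcores per node in yarn-site.xml comes down to something like the following (a minimal sketch using the standard YARN property; the value of 7 is what we set, the rest is boilerplate):

    <property>
      <!-- number of vcores each NodeManager offers to YARN containers -->
      <name>yarn.nodemanager.resource.cpu-vcores</name>
      <value>7</value>
    </property>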

 

When the job is running, it only uses 2 vcores; the other 5 vcores are left idle, and the job is slow, with a lot of batches queued up.

How can we make it use all 7 of the vcores available to it, so that our job speeds up? The screenshot below shows the vcore usage while the job is running.

 

[Screenshot: VcoresStack2.PNG]

 

We would greatly appreciate it if any of the experts in the community could help out, as we are new to YARN and Spark.

Cloudera Employee
Posts: 97
Registered: 05-10-2016

Re: Spark with YARN: how to configure YARN to use all vcores

From your first screenshot, you have already maxed out your memory, so you won't be able to allocate more YARN containers. You may want to lower your Spark memory settings or increase your cores per executor when submitting your Spark application.
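For example, something along these lines at submit time (just a sketch; the executor count, memory size, and application name below are placeholders you would tune for your own nodes):

    # Sketch only: either raise --executor-cores so each container uses more
    # of the 7 vcores, or lower --executor-memory so YARN can fit more
    # containers per node. your_streaming_app.py is a placeholder.
    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --num-executors 3 \
      --executor-cores 7 \
      --executor-memory 4g \
      your_streaming_app.py

Which knob to turn depends on whether memory or vcores runs out first on your nodes.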