I am referring to this link - https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.4.2/bk_yarn_resource_mgt/content/flexible_sched... In that example, two queues have the same resources available. One uses the FIFO ordering policy and the other uses the Fair Sharing policy. A user submits three jobs to each queue, one right after another, waiting just long enough for each job to start. The first job uses 6x the resource limit of the queue, the second 4x, and the last 2x. In the FIFO queue, the 6x job starts and runs to completion, then the 4x job starts and runs to completion, and then the 2x job; they start and finish in the order 6x, 4x, 2x.

However, when I try the Capacity Scheduler with the default FIFO ordering policy (on Hortonworks Data Platform), I can see two jobs running concurrently on the same queue, which is not how it is supposed to run according to the information above.
That tells me that the Resource Manager determined that there were enough resources to run both jobs. Here are a couple of things to keep in mind while using the Capacity Scheduler:
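For reference, the ordering policy is set per leaf queue in capacity-scheduler.xml. A minimal sketch, assuming two hypothetical queues named `fifoq` and `fairq` under `root` (the queue names and capacities here are illustrative, not from the original post):

```xml
<!-- capacity-scheduler.xml (fragment): hypothetical queues "fifoq" and "fairq" -->
<property>
  <name>yarn.scheduler.capacity.root.queues</name>
  <value>fifoq,fairq</value>
</property>
<property>
  <name>yarn.scheduler.capacity.root.fifoq.capacity</name>
  <value>50</value>
</property>
<property>
  <name>yarn.scheduler.capacity.root.fairq.capacity</name>
  <value>50</value>
</property>
<property>
  <!-- "fifo" is the default ordering policy for Capacity Scheduler leaf queues -->
  <name>yarn.scheduler.capacity.root.fifoq.ordering-policy</name>
  <value>fifo</value>
</property>
<property>
  <!-- "fair" enables Fair Sharing ordering within this single queue -->
  <name>yarn.scheduler.capacity.root.fairq.ordering-policy</name>
  <value>fair</value>
</property>
```

Note that even with the `fifo` ordering policy, FIFO governs the order in which applications are *offered* resources, not strict one-at-a-time execution: if the first job is not consuming the queue's full capacity, the scheduler can hand leftover resources to the next job, which is consistent with seeing two jobs running concurrently.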