Created 10-22-2016 07:26 PM
There is nothing native within Spark to handle running queries in parallel. Instead, take a look at Java concurrency, in particular Futures [1], which let you start queries in parallel and check their status later.
1. https://docs.oracle.com/javase/7/docs/api/java/util/concurrent/Future.html
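As a minimal sketch of the approach: submit each query to an `ExecutorService` and hold on to the returned `Future`s, collecting results once all queries are in flight. The `runQuery` method here is a hypothetical stand-in that just simulates latency; in a real application it would wrap the actual Spark call (e.g. `sqlContext.sql(...).collect()`).

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelQueries {

    // Hypothetical placeholder for a real Spark query; replace the body
    // with the actual query execution in your application.
    static String runQuery(String name) {
        try {
            Thread.sleep(50); // simulate query latency
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return name + ": done";
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        List<Future<String>> futures = new ArrayList<>();

        // Kick off all queries in parallel; submit() returns immediately.
        for (String q : new String[] {"q1", "q2", "q3"}) {
            futures.add(pool.submit(() -> runQuery(q)));
        }

        // Come back later and check status / collect results.
        for (Future<String> f : futures) {
            System.out.println(f.get()); // blocks until that query finishes
        }
        pool.shutdown();
    }
}
```

Because `submit` returns immediately, the queries run concurrently on the pool's threads, and `Future.get()` (or the non-blocking `Future.isDone()`) lets you check on each one whenever convenient.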