For scenarios involving data processing, Spark would be my tool of choice, but I am not sure how we are expected to run the Spark code: should we run it interactively from the spark-shell, or build a jar with sbt and submit it using spark-submit? If it is the latter, will we be given a template project to build upon? Clarifying this would help me judge whether Spark is practically viable for the transformations and processing given the time constraints.
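For context, if the jar route is expected, my working assumption is a minimal sbt project along these lines (the project name, Scala version, and Spark version below are placeholders, not anything from the course):

```scala
// build.sbt — minimal sketch, assuming a Spark 3.x cluster on Scala 2.12
name := "spark-assignment"   // hypothetical project name
version := "0.1"
scalaVersion := "2.12.18"

// Marked "provided" because spark-submit supplies the Spark runtime,
// so the jar only needs our own code and any extra dependencies.
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.5.1" % "provided"
```

I would then package it with `sbt package` and run it with something like `spark-submit --class <main class> --master local[*] target/scala-2.12/spark-assignment_2.12-0.1.jar` (main class and master URL being whatever the assignment environment expects). If this matches the intended workflow, a template with these pieces already wired up would save a lot of setup time.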