Created 04-22-2016 02:10 PM
I think the best option for compiling Scala Spark code is to use sbt, which is a build tool that also manages dependencies. You can do the same with Maven if you prefer.
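For reference, a minimal build.sbt along these lines is usually enough to get started. The project name, Scala version, and Spark version below are assumptions; adjust them to match your cluster:

```scala
// Minimal build.sbt sketch (name and versions are placeholders)
name := "spark-demo"

version := "0.1.0"

scalaVersion := "2.10.5"

// "provided" keeps Spark out of your jar, since the cluster already supplies it;
// %% appends the Scala version suffix (_2.10) to the artifact name
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1" % "provided"
```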
Created 04-22-2016 02:10 PM
See the "Self-Contained Applications" section of the Spark quick-start guide (http://spark.apache.org/docs/latest/quick-start.html#self-contained-applications); it has examples for all supported languages.
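The Scala app in that guide is essentially the following (a minimal sketch; the file path is a placeholder you must change to a real file on your cluster):

```scala
/* SimpleApp.scala */
import org.apache.spark.{SparkConf, SparkContext}

object SimpleApp {
  def main(args: Array[String]): Unit = {
    // Placeholder path: point this at any text file the cluster can read
    val logFile = "YOUR_SPARK_HOME/README.md"
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)

    // Count the lines containing "a" and the lines containing "b"
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(_.contains("a")).count()
    val numBs = logData.filter(_.contains("b")).count()
    println(s"Lines with a: $numAs, Lines with b: $numBs")

    sc.stop()
  }
}
```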
Created 04-22-2016 02:11 PM
There are many ways to do this, but you can use the sbt tool to build your application jar. Below is a good example doc that walks through building a jar and running it on Spark:
https://jaceklaskowski.gitbooks.io/mastering-apache-spark/content/spark-first-app.html
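In short: once your build definition is in place, running `sbt package` produces the jar under `target/scala-<version>/`, and you submit it to the cluster with `spark-submit`, passing your main class via `--class`.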
Created 04-22-2016 06:45 PM
In case you are looking for a Maven project to build Spark/Scala, here is an example: https://github.com/vinayshukla/SparkDemo1
Note that it was written for Spark 1.1.0, but you can change the version.
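To target a newer release, you would typically bump the version of the org.apache.spark spark-core dependency in the project's pom.xml to match your cluster, and rebuild with `mvn package`.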