
Tutorial on Building Spark Development Environment


Hi all,


I've been combing through the Cloudera documentation and found some good tutorials on developing Spark jobs with Scala; however, I can't find one geared toward my scenario.


I'm attempting to compile a "*.scala" file into a "*.jar" file using Maven. Since I'm doing this on the edge node, which has no Internet access, I'm running into some challenges.
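For anyone hitting the same wall, here is a minimal sketch of a pom.xml for building a Spark/Scala job with Maven. The groupId, artifactId, and version numbers are placeholders; match the Spark and Scala versions to whatever your cluster actually runs.

```xml
<!-- Hypothetical pom.xml sketch; coordinates and versions are placeholders -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>spark-job</artifactId>
  <version>1.0.0</version>

  <dependencies>
    <!-- "provided" scope keeps Spark out of your JAR; the cluster supplies it -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>2.4.0</version>
      <scope>provided</scope>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <!-- scala-maven-plugin compiles .scala sources during `mvn package` -->
      <plugin>
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <version>4.5.6</version>
        <executions>
          <execution>
            <goals><goal>compile</goal></goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>
```

To work around the missing Internet access, one common approach is to run `mvn dependency:go-offline` for this pom on a machine that does have connectivity, copy the resulting `~/.m2/repository` directory to the edge node, and then build there with Maven's offline flag: `mvn -o package`.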


Can anyone recommend a good tutorial, or suggest how I should approach this? I downloaded Eclipse and was about to start attempting to develop on my own (Windows) workstation, but I'd prefer not to get too far along if that's the wrong course of action.


Any advice/assistance is greatly appreciated.






Can you provide more details about what you are trying to do and the challenges you are having? I'm in a similar situation with a cluster that has no Internet access. Are you trying to set up a local Maven repository?


Sorry, I didn't notice you had responded to my question. If you are still looking for information on this: I ended up compiling the programs in an IDE (I've used both Eclipse and IntelliJ) and then copying the pre-compiled JAR file over to the associated Cloudera environment for execution with spark2-submit.
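The workflow looks roughly like this. The JAR name, main class, edge-node host, and paths below are placeholders; substitute your own.

```
# Copy the JAR built in your IDE to the edge node (names are placeholders)
scp target/spark-job-1.0.0.jar user@edge-node:/home/user/

# On the edge node, submit it to the cluster with spark2-submit
spark2-submit \
  --class com.example.MyApp \
  --master yarn \
  --deploy-mode client \
  /home/user/spark-job-1.0.0.jar
```

With `--deploy-mode client` the driver runs on the edge node itself, which makes it easy to watch the job's output; `cluster` mode would run the driver inside YARN instead.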


Let me know if you would like additional information.


