How to install Spark/Scala on Windows?

I'm in an internship position and I have limited rights to download or install software on the system. My problem is how to install Apache Spark on Windows 7 (I have always used it through Hortonworks in a VM, but in my internship I'm not allowed to install a VM or Hortonworks). I searched the forum and I think I can use Eclipse, import Spark, and install the Scala IDE (Scala is my preferred language with Spark), but I haven't managed to reach a working solution.

In fact, I tried to install the Scala IDE into Eclipse Juno, but it failed with an error like this:

Software being installed: Scala Search 0.2.5.v-2_11-201505250900-dd17080 (org.scala.tools.eclipse.search.feature.feature.group 0.2.5.v-2_11-201505250900-dd17080)
  Missing requirement: Scala Refactoring 0.6.3.2_11-201410271313-539abd5 (org.scala-refactoring.library 0.6.3.2_11-201410271313-539abd5) requires 'bundle org.junit 4.11.0' but it could not be found
  Missing requirement: Scala Refactoring 0.6.3.2_11-201501121757-539abd5 (org.scala-refactoring.library 0.6.3.2_11-201501121757-539abd5) requires 'bundle org.junit 4.11.0' but it could not be found
  Missing requirement: Scala Refactoring 0.6.3.2_11-201503031801-539abd5 (org.scala-refactoring.library 0.6.3.2_11-201503031801-539abd5) requires 'bundle org.junit 4.11.0' but it could not be found
  Cannot satisfy dependency: From: Scala IDE for Eclipse 4.0.0.v-2_11-201412171518-2279837 (org.scala-ide.sdt.feature.feature.group 4.0.0.v-2_11-201412171518-2279837) To: org.scala-refactoring.library [0.6.3.2_11-201410271313-539abd5]

Can you give me any suggestions or ideas, please?

2 REPLIES

@SMACH H

Try this: http://www.ics.uci.edu/~shantas/Install_Spark_on_Windows10.pdf

It is the same on Windows 7.

The same steps should also work for the latest Spark release, 2.1; you just need to use the download link for 2.1 instead.
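
Once the download is unpacked and SPARK_HOME is set as the guide describes, a quick sanity check (my sketch, not part of the linked PDF; it assumes a Spark 2.x spark-shell, where a SparkSession named spark is created for you) is to paste a few lines into spark-shell:

  // Paste into spark-shell to confirm the local install works.
  val data = spark.range(1, 1001)                          // Dataset[Long] holding 1..1000
  println(data.count())                                    // expect 1000
  println(data.selectExpr("sum(id)").first.getLong(0))     // expect 500500

If that prints 1000 and 500500 without stack traces, the shell and the local executors are working.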

+++

If this is helpful, please vote and accept as the best answer.

Constantin's answer seems a good way to install a packaged release. Do grab the relevant 2.6, 2.7, or 2.8 version of the Windows executables you'll need underneath: https://github.com/steveloughran/winutils . Alternatively, you can set up the Windows bits by installing HDP 2.5 for Windows and then turning off any Hadoop services it sets to start automatically. That will put the Hadoop 2.7.x binaries on your classpath.
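
To make the winutils route concrete, here is a minimal local-mode sketch in Scala. The path C:\hadoop is an assumption (use wherever you actually put winutils.exe, so that it ends up at %HADOOP_HOME%\bin\winutils.exe), and it presumes the Spark 2.x jars for Scala 2.11 are already on the project's classpath:

  import org.apache.spark.sql.SparkSession

  object WindowsLocalSpark {
    def main(args: Array[String]): Unit = {
      // Same effect as setting the HADOOP_HOME environment variable:
      // Hadoop's shell utilities look for %HADOOP_HOME%\bin\winutils.exe.
      System.setProperty("hadoop.home.dir", "C:\\hadoop")

      val spark = SparkSession.builder()
        .appName("windows-local-test")
        .master("local[*]")                 // run inside this JVM, no cluster needed
        .getOrCreate()

      spark.range(1, 11).toDF("n").show()   // should print the numbers 1 through 10

      spark.stop()
    }
  }

Setting hadoop.home.dir from code is handy on a locked-down machine because it avoids having to define environment variables at all.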

The other way is to check out and build Spark yourself, which you can do straight from Maven, or, with an IDE like IntelliJ IDEA, have it import the Spark POM and do the build. You'll still need a native Windows HADOOP_HOME/bin directory, though.
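
If building all of Spark is more than you need, a lighter shortcut (a sketch with assumed version numbers, not the build-from-source route described above) is to declare Spark as an ordinary dependency in a small sbt project and let the IDE import that instead:

  // build.sbt: depend on a released Spark rather than building it from source.
  name := "spark-windows-sandbox"

  version := "0.1"

  scalaVersion := "2.11.8"          // match the Scala version of the Spark release you pick

  libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "2.1.0",
    "org.apache.spark" %% "spark-sql"  % "2.1.0"
  )

You still need the winutils.exe / HADOOP_HOME setup described above; the dependency route only saves you from building or installing Spark itself.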
