Spark and scala



New Contributor

Hi

Need to know if there are build tools for Spark with Scala on the HDP VM. I do not see one. I need to build a program using Scala on Spark and need to know how to build it.

Regards

Sreeni

3 Replies

Re: Spark and scala

@Sreenivas Adiki

Please see this tutorial guide: http://hortonworks.com/hadoop/spark/#section_6

It has details on the tools.
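If a command-line build tool is what you're after rather than an IDE, sbt is the usual choice for Spark-with-Scala projects. A minimal sketch follows; the project name, Scala version, and Spark version are assumptions, so match them to what is actually on your HDP VM:

```shell
# Create a minimal sbt project skeleton (names here are hypothetical).
mkdir -p sparkdemo/src/main/scala
cd sparkdemo
cat > build.sbt <<'EOF'
name := "sparkdemo"
scalaVersion := "2.11.8"
// "provided": the cluster supplies Spark at runtime, so it is not bundled into the jar
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1" % "provided"
EOF
# sbt package   # would compile src/main/scala and produce a jar under target/
```

Running `sbt package` on the VM then gives you a jar you can hand to spark-submit.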

Re: Spark and scala

New Contributor

Step 1: Install an older version of Eclipse using the command below:

sudo apt-get install eclipse

(or)

Download and install the latest Eclipse release (currently Eclipse Oxygen) from the link below:

http://www.eclipse.org/downloads/packages/eclipse-ide-java-developers/oxygen1a

Step 2: At this point, I assume you already have the Java 1.8 JDK installed on your system.

Step 3: The downloaded file will be named eclipse-java-oxygen-1a-linux-gtk-x86_64.tar.gz.

Extract it using the command below:

tar xvfz eclipse-java-oxygen-1a-linux-gtk-x86_64.tar.gz

Step 4: After extracting the tar file, enter the eclipse-installer directory and run the installer:

  • cd eclipse-installer/
  • ./eclipse-inst
  • Choose Eclipse IDE for Java Developers in the pop-up.
  • Choose the installation folder where you would like to install Eclipse Oxygen.
  • Accept all certificates; you can then launch the application.

Step 5: You can set the environment variables for Java in your ~/.bashrc file as below. The paths assume the JDK was installed under /opt/jdk1.8.0_91; adjust them to your actual install location:

Set up the JAVA_HOME variable:

export JAVA_HOME=/opt/jdk1.8.0_91

Set up the JRE_HOME variable:

export JRE_HOME=/opt/jdk1.8.0_91/jre

Set up the PATH variable:

export PATH=$PATH:/opt/jdk1.8.0_91/bin:/opt/jdk1.8.0_91/jre/bin
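After adding those lines, reload the file and confirm the variables took effect. A quick check, again assuming the /opt/jdk1.8.0_91 install path:

```shell
# Reload ~/.bashrc in the current shell, then verify the Java variables.
# source ~/.bashrc        # run this on the VM after editing the file
export JAVA_HOME=/opt/jdk1.8.0_91      # assumed install path
export PATH="$PATH:$JAVA_HOME/bin"
echo "JAVA_HOME is $JAVA_HOME"
# java -version           # should report 1.8.x once the JDK is on PATH
```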

Step 6: Now launch Eclipse and install the Scala IDE plugin as shown below:

(Screenshots: installing the Scala IDE plugin — 43453-1.jpg, 43455-2.jpg)

Step 7: Follow the instructions to install the Scala IDE for your application. Once it is installed, you will be asked to restart Eclipse. After restarting, open the Scala perspective as in the screenshot below and start programming Spark with Scala. (You need to set SPARK_HOME and add Spark's bin directory to your PATH in order to use Spark.)

(Screenshot: opening the Scala perspective — 43456-3.jpg)
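The SPARK_HOME note in Step 7 can be sketched as below. The path is an assumption — on an HDP sandbox the Spark client typically lives under /usr/hdp — so point it at your actual Spark directory:

```shell
# Assumed HDP layout; verify the path on your own VM.
export SPARK_HOME=/usr/hdp/current/spark-client
export PATH="$PATH:$SPARK_HOME/bin"
echo "SPARK_HOME is $SPARK_HOME"
# spark-shell   # would now resolve from PATH and open a Scala REPL on Spark
```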

Step 8: The Eclipse application with the Scala IDE installed:

(Screenshot: Eclipse with the Scala IDE — 43457-4.jpg)
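To close the loop on the original question: once the program is built into a jar (exported from Eclipse or packaged with sbt), it is launched with spark-submit. The class and jar names below are placeholders, not anything from this thread:

```shell
# Assemble a spark-submit invocation (class and jar names are hypothetical).
APP_CLASS="com.example.SparkDemo"
APP_JAR="target/scala-2.11/sparkdemo_2.11-0.1.jar"
echo "spark-submit --class $APP_CLASS --master yarn-client $APP_JAR"
# On the VM, drop the echo (and use your real names) to actually submit the job.
```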