Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.
Announcements
This board is archived and read-only for historical reference. To ask a new question, please post a new topic on the appropriate active board.

Spark sbt tool

Explorer

I've been using the Spark shell on a CDH5 cluster, and now I'd like to use the sbt tool to create a standalone Scala program to be deployed on the cluster, but I can't locate where the sbt tool is installed. Is sbt included in CDH5, and if so, where is it installed?


Thanks.

Stefan

1 ACCEPTED SOLUTION

Master Collaborator

No, sbt and Scala are not installed. sbt is a build tool used at development time rather than at runtime; it's a lot like Maven in this respect. You would use sbt (or Maven) on your development machine to package your application as a .jar file, and then run that jar on your cluster.
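To make the workflow concrete, here is a minimal sketch of a build.sbt for a standalone Spark app. The project name, Scala version, and Spark version are assumptions, not anything CDH5-specific (CDH5 shipped Spark 1.x built against Scala 2.10; check what your cluster actually provides and match it):

```scala
// build.sbt -- minimal sketch; names and versions below are assumptions.
name := "my-spark-app"        // hypothetical project name
version := "0.1"
scalaVersion := "2.10.4"      // match the Scala version your Spark build uses

// "provided" keeps Spark out of your packaged jar,
// since the cluster supplies Spark at runtime.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0" % "provided"
```

Running `sbt package` on your development machine then produces a jar under `target/scala-2.10/`, which you can copy to the cluster and launch with `spark-submit`, passing your main class via `--class`.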


2 REPLIES


Explorer

Thanks.

Stefan