Spark sbt tool
Labels: Apache Spark
Created on 04-12-2014 07:32 AM - edited 09-16-2022 01:57 AM
I've been using the Spark shell on a CDH5 cluster, and now I'd like to use the sbt tool to build a standalone Scala program to deploy on the cluster, but I can't locate where sbt is installed. Is sbt included in CDH5, and if so, where is it installed?
Thanks.
Stefan
Created 04-12-2014 07:37 AM
No, sbt and Scala are not installed. SBT is a build tool used at development time rather than at runtime -- in this respect it's a lot like Maven. You would use SBT (or Maven) on your development machine to create a .jar file, then run that jar on your cluster.
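For example, a minimal build.sbt for such a project might look like the sketch below. The project name and the Scala/Spark version numbers are placeholders, not anything CDH-specific; use whatever matches the Spark that actually ships with your CDH5 cluster.

```scala
// build.sbt -- minimal sketch for a standalone Spark application.
// The name and versions below are assumptions; adjust them to match
// the Scala and Spark versions on your cluster.

name := "my-spark-app"        // hypothetical project name

version := "0.1"

scalaVersion := "2.10.4"      // assumed Scala version

// Spark is already present on the cluster, so mark it "provided"
// to keep it out of your application jar.
libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.0" % "provided"
```

Running `sbt package` in the project directory then produces the .jar under the target/ directory, which you copy to the cluster and run against the Spark installation there.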
Created 04-12-2014 10:03 AM
Thanks.
Stefan
