Created on 06-18-2014 08:07 PM - edited 09-16-2022 02:00 AM
Thanks.
How do I set up the Spark environment in a Scala app for CDH 4.6?
How do I set the sparkHome, master, and masterHostname parameters in the following code?
val jarFile = "/opt/SparkALSApp/out/artifacts/SparkALSApp_jar/SparkALSApp.jar" // "target/scala-2.10/movielens-als_2.10-0.0.jar"
val sparkHome = "???" // "/root/spark"
val master = "???" // Source.fromFile("/root/spark-ec2/cluster-url").mkString.trim
val masterHostname = "???" // Source.fromFile("/root/spark-ec2/masters").mkString.trim
val conf = new SparkConf()
  .setMaster(master)
  .setSparkHome(sparkHome)
  .setAppName("MovieLensALS")
  .set("spark.executor.memory", "8g")
  .setJars(Seq(jarFile))
val sc = new SparkContext(conf)
Created 06-18-2014 11:14 PM
master is the host:port where the Spark master is running. This depends on your cluster configuration, and I don't know your machine name, but the default port is 18080. masterHostname does not seem to be used in your code.
sparkHome may not need to be set at all; if it does, it refers to the /opt/parcels/.../lib/spark directory where CDH is installed.
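For illustration only, here is a minimal sketch of the original snippet with those values filled in. The master hostname and port, and the parcel path, are placeholders I am assuming here, not values confirmed in this thread; check your own cluster for the actual master URL (the Spark Master web UI shows it) and for the spark directory under your parcel installation.

import org.apache.spark.{SparkConf, SparkContext}

object MovieLensALS {
  def main(args: Array[String]): Unit = {
    // All values below are placeholders; adjust them to your cluster.
    val jarFile   = "/opt/SparkALSApp/out/artifacts/SparkALSApp_jar/SparkALSApp.jar"
    // Assumed parcel layout; substitute the spark directory of your own CDH install.
    val sparkHome = "/opt/cloudera/parcels/CDH/lib/spark"
    // Replace "master-host" with your Spark master's hostname and use the port
    // your master actually listens on (the reply above mentions 18080).
    val master    = "spark://master-host:18080"

    val conf = new SparkConf()
      .setMaster(master)
      .setSparkHome(sparkHome)
      .setAppName("MovieLensALS")
      .set("spark.executor.memory", "8g")
      .setJars(Seq(jarFile))

    val sc = new SparkContext(conf)
    // ... ALS job would go here ...
    sc.stop()
  }
}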
Created 06-20-2014 05:43 AM
Thanks.