Support Questions


How to set up the Spark environment in a Scala app for CDH4.6

Explorer


How do I set up the Spark environment in a Scala app for CDH4.6?

How do I set the sparkHome, master, and masterHostname parameters in the following code?

 

import scala.io.Source
import org.apache.spark.{SparkConf, SparkContext}

val jarFile = "/opt/SparkALSApp/out/artifacts/SparkALSApp_jar/SparkALSApp.jar" // "target/scala-2.10/movielens-als_2.10-0.0.jar"
val sparkHome = "???" // "/root/spark"
val master = "???" // Source.fromFile("/root/spark-ec2/cluster-url").mkString.trim
val masterHostname = "???" // Source.fromFile("/root/spark-ec2/masters").mkString.trim
val conf = new SparkConf()
  .setMaster(master)
  .setSparkHome(sparkHome)
  .setAppName("MovieLensALS")
  .set("spark.executor.memory", "8g")
  .setJars(Seq(jarFile))
val sc = new SparkContext(conf)

Xuesong
1 ACCEPTED SOLUTION

Master Collaborator

master is the host:port where the Spark master is running. That depends on your cluster configuration, of course, and I don't know your machine name, but the default port is 18080. masterHostname does not appear to be used in your code.

 

sparkHome may not need to be set, but if it does, it refers to the /opt/parcels/.../lib/spark directory where CDH is installed.
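A minimal sketch of how those values might be filled in. Everything concrete here is an assumption for illustration: "cdh-master" is a hypothetical hostname, 7077 is the usual Spark standalone master port (your CDH setup may differ, per the port mentioned above), and the parcel path reflects a typical CDH layout rather than your actual installation.

```scala
// Hypothetical values -- replace with the ones from your own cluster.
// The most reliable source for the master URL is the spark:// URL shown
// in your Spark master's web UI.
val masterHostname = "cdh-master"                      // assumption: your master's hostname
val master = s"spark://$masterHostname:7077"           // standalone master URL form
val sparkHome = "/opt/cloudera/parcels/CDH/lib/spark"  // assumption: typical CDH parcel path
```

Rather than constructing the URL by hand, it is safest to copy the spark:// URL displayed at the top of the master's web UI, since it reflects the actual bind host and port.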


2 REPLIES


Explorer

Thanks.

Xuesong