New Contributor
Posts: 4
Registered: 09-06-2016

Unable to use GraphX after AddJar directive

A simple Scala program:

spark.version
//%AddJar file:///home/cdsw/.m2/repository/commons-lang/commons-lang/2.4/commons-lang-2.4.jar
import org.apache.spark.graphx.VertexId
val vid = List(78L)
val inRdd = sc.parallelize(vid)
val vs = inRdd.map { Tuple2(_, ("Bruce", "301")) }
val children = List(79L, 80L, 81L, 82L, 83L, 84L, 85L, 86L, 100L)

import org.apache.spark.graphx.Edge
def member2OutEdges(vid: VertexId, children: List[Long], attr: Int = 0): Seq[Edge[Int]] = {
  children.map { Edge(vid, _, attr) }
}

val es = inRdd.flatMap { member2OutEdges(_, children) }

import org.apache.spark.graphx.Graph
val graph = Graph(vs, es)

This gives:

spark.version
2.2.0.cloudera1
import org.apache.spark.graphx.VertexId
val vid= List(78L)
val inRdd =  sc.parallelize(vid)
val vs =inRdd.map{ Tuple2(_, ("Bruce","301"))}
val children=List(79L,80L,81L,82L,83L,84L,85L,86L,100L)
import org.apache.spark.graphx.Edge
def member2OutEdges(vid: VertexId, children:List[Long], attr: Int = 0):Seq[Edge[Int]] = {
  children.map{Edge(vid,_ , attr)}
}
val es= inRdd.flatMap { member2OutEdges(_,children) }
import org.apache.spark.graphx.Graph
val graph=Graph(vs, es)

Everything works as expected.

Uncomment line 2 (the %AddJar directive) and you get:

Name: Compile Error
Message: <console>:63: error: type mismatch;
 found   : org.apache.spark.rdd.RDD[(Long, (String, String))]
 required: org.apache.spark.rdd.RDD[(org.apache.spark.graphx.VertexId, ?)]
    (which expands to)  org.apache.spark.rdd.RDD[(Long, ?)]
Error occurred in an application involving default arguments.
       val graph=Graph(vs, es)
                       ^
<console>:63: error: type mismatch;
 found   : org.apache.spark.rdd.RDD[org.apache.spark.graphx.Edge[Int]]
 required: org.apache.spark.rdd.RDD[org.apache.spark.graphx.Edge[?]]
Error occurred in an application involving default arguments.
       val graph=Graph(vs, es)
                           ^
StackTrace: 
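For what it's worth, `VertexId` is declared in GraphX's package object as a plain type alias for `Long`, so the "found" and "required" types in the error are textually identical; the mismatch suggests that after the %AddJar reload the REPL resolves Spark's classes through a different classloader than the one used for the already-defined vals. A standalone sketch (no Spark needed; `VertexIdAliasDemo` is a made-up name for illustration) showing why the alias normally makes the two types interchangeable:

```scala
// Standalone illustration: GraphX declares, in its package object,
// roughly `type VertexId = Long`. The alias is transparent to the
// compiler, so a Long and a VertexId are ordinarily interchangeable.
object VertexIdAliasDemo {
  type VertexId = Long // stand-in for org.apache.spark.graphx.VertexId

  val pair: (VertexId, (String, String)) = (78L, ("Bruce", "301"))
  val asLong: Long = pair._1 // accepted with no conversion: same type
}
```

Under normal conditions this compiles without complaint, which is exactly why the error above looks contradictory.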

 


Re: Unable to use GraphX after AddJar directive

FYI: this failure does not happen if you add the jars via spark-defaults.conf instead:
spark.jars=/home/cdsw/.m2/repository/commons-lang/commons-lang/2.4/commons-lang-2.4.jar,/home/cdsw/.m2/repository/simple/simple/0.0.1-SNAPSHOT/simple-0.0.1-SNAPSHOT.jar
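The same effect should be achievable per-session rather than globally; a hedged sketch (untested here; `--jars` is the standard spark-shell/spark-submit option, and the paths are the ones from above):

```shell
# Put the jars on the classpath at JVM start, so %AddJar (and the
# classloader reload it triggers) is never needed inside the session.
spark-shell \
  --jars /home/cdsw/.m2/repository/commons-lang/commons-lang/2.4/commons-lang-2.4.jar,/home/cdsw/.m2/repository/simple/simple/0.0.1-SNAPSHOT/simple-0.0.1-SNAPSHOT.jar
```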