
Spark/Scala Error: value toDF is not a member of org.apache.spark.rdd.RDD

Contributor

Hi all,

I am trying to create a DataFrame from a text file, which gives me the error: "value toDF is not a member of org.apache.spark.rdd.RDD"

The only solution I can find online is to import SQLContext.implicits._, which in turn throws "not found: value SQLContext"
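From what I can tell, in Spark 1.x these implicits are meant to be imported from a SQLContext instance rather than from the class itself, i.e. something along these lines (assuming sc is an existing SparkContext):

val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.implicits._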

 

I googled this new error but couldn't find anything. The funny part is that the piece of code I am using works in spark-shell, but fails when I try to build it using sbt package.

I am using Cloudera's QuickStart VM; my Spark version is 1.3.0 and my Scala version is 2.10.4.

 

Any help is highly appreciated,

Cheers.

 

Here is my piece of code:

 

 

import...........
import SQLContext.implicits._
...

class Class_1() extends Runnable {

  val conf = new SparkConf().setAppName("TestApp")
  val sc = new SparkContext(conf)
  val sqlContext = new org.apache.spark.sql.SQLContext(sc)
  var fDimCustomer = sc.textFile("DimCustomer.txt")

  def loadData(fileName: String) {

    fDimCustomer = sc.textFile("DimCustomer.txt")

    case class DimC(ID: Int, Name: String)
    var dimCustomer1 = fDimCustomer.map(_.split(',')).map(r => DimC(r(0).toInt, r(1))).toDF
    dimCustomer1.registerTempTable("Cust_1")

    val customers = sqlContext.sql("select * from Cust_1")
    customers.show()
  }

  ......
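
For reference, a minimal sketch of how this is usually restructured so that toDF also resolves when built with sbt (Spark 1.3 / Scala 2.10 assumed; the class name, case class and file name are taken from the code above, and the run() body is only a placeholder):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Defined at the top level: with Scala 2.10, a case class declared inside a
// method cannot be used with toDF because reflection fails to find it.
case class DimC(ID: Int, Name: String)

class Class_1() extends Runnable {

  val conf = new SparkConf().setAppName("TestApp")
  val sc = new SparkContext(conf)
  val sqlContext = new SQLContext(sc)

  // toDF is provided by the implicits of this SQLContext instance, so the
  // import has to refer to the instance and come after it is created.
  import sqlContext.implicits._

  def loadData(fileName: String): Unit = {
    val fDimCustomer = sc.textFile(fileName)

    val dimCustomer1 = fDimCustomer
      .map(_.split(','))
      .map(r => DimC(r(0).toInt, r(1)))
      .toDF()

    dimCustomer1.registerTempTable("Cust_1")

    val customers = sqlContext.sql("select * from Cust_1")
    customers.show()
  }

  // Placeholder only; the original run() body is not shown in the post.
  def run(): Unit = loadData("DimCustomer.txt")
}

The two key points are that the case class is defined at the top level (not inside the method) and that the implicits are imported from the sqlContext instance after it has been created.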
