Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant


OK, I finally fixed the issue. Two things needed to be done:

 

1- Import implicits:

      Note that this should be done only after an instance of org.apache.spark.sql.SQLContext has been created. It should be written as:

      val sqlContext = new org.apache.spark.sql.SQLContext(sc)
      import sqlContext.implicits._


2- Move the case class outside of the method:

      The case class you use to define the schema of the DataFrame must be declared outside of the method that needs it (at the top level or inside an object); otherwise the compiler cannot generate the type information Spark needs. You can read more about it here:

      https://issues.scala-lang.org/browse/SI-6649
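Putting the two fixes together, a minimal sketch might look like the following. The `Person` case class, its fields, and the application name are hypothetical placeholders, not from the original post, and this assumes a Spark 1.x-style `SQLContext` as used above:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Fix 2: the case class is defined at the top level, outside any method,
// so the compiler can provide the type information Spark needs (see SI-6649).
case class Person(name: String, age: Int)

object ToDataFrameExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("example").setMaster("local[*]"))

    // Fix 1: create the SQLContext first, then import its implicits --
    // the import only works on a concrete instance, not on the class.
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    import sqlContext.implicits._

    // With both fixes in place, toDF() is available on RDDs of Person.
    val df = sc.parallelize(Seq(Person("Ann", 30), Person("Bob", 25))).toDF()
    df.show()

    sc.stop()
  }
}
```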

 

Cheers.
