Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.
Announcements
This board is archived and read-only for historical reference. To ask a new question, please post a new topic on the appropriate active board.

SQLContext Error - CreateSchemaRDD

Expert Contributor

When I try to import createSchemaRDD from SQLContext, it raises an error in the Spark shell:

scala> sc
res4: org.apache.spark.SparkContext = org.apache.spark.SparkContext@4345fd45

scala> val sqlContext = new org.apache.spark.sql.SQLContext(sc)
sqlContext: org.apache.spark.sql.SQLContext = org.apache.spark.sql.SQLContext@293432e6

scala> import sqlContext.createSchemaRDD
<console>:31: error: value createSchemaRDD is not a member of org.apache.spark.sql.SQLContext
       import sqlContext.createSchemaRDD

Also, createSchemaRDD does not appear as a member when I press the TAB key, even though the Spark SQL documentation gives it as a correct import.

Thanks

Sridhar

1 ACCEPTED SOLUTION

5 REPLIES


Hi Sridhar, can you post which version of Spark you are running and a link to the documentation you're referring to?

Expert Contributor

Hi Robert, the Spark version is 1.6.0,

and the documentation I am referring to is

https://spark.apache.org/docs/1.1.0/sql-programming-guide.html

Expert Contributor

Hi Robert, it works correctly if I use

import sqlContext.implicits._

which is correct for Spark version 1.6.0.

Now I am referring to the 1.6.1 documentation:

http://spark.apache.org/docs/latest/sql-programming-guide.html

Is this documentation OK, or should I refer to another doc specifically for 1.6.0?
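[Archive editor's note] For readers landing here: createSchemaRDD was removed in Spark 1.3, when SchemaRDD was replaced by the DataFrame API, which is why it is absent from SQLContext in 1.6. The fix above can be sketched as follows; the Person case class and sample data are illustrative, not from the thread:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Illustrative case class; any Product type with supported field types works.
case class Person(name: String, age: Int)

object ImplicitsExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("implicits-example").setMaster("local[*]"))
    val sqlContext = new SQLContext(sc)

    // Replaces the pre-1.3 `import sqlContext.createSchemaRDD`:
    // brings the RDD-to-DataFrame implicit conversions into scope.
    import sqlContext.implicits._

    // toDF() is provided by the implicits above, just as the old
    // createSchemaRDD implicit once converted RDDs to SchemaRDDs.
    val people = sc.parallelize(Seq(Person("Ada", 36), Person("Alan", 41))).toDF()
    people.printSchema()

    sc.stop()
  }
}
```

In the spark-shell the SparkContext and SQLContext are already created for you, so only the import and the toDF() call are needed.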


Sridhar, as long as you're using Spark 1.6 I'd refer to https://spark.apache.org/docs/1.6.1/sql-programming-guide.html

Expert Contributor

Thanks Robert!