Created 03-16-2016 11:40 PM
When I try to import createSchemaRDD from SQLContext, it throws an error in spark-shell:
scala> sc
res4: org.apache.spark.SparkContext = org.apache.spark.SparkContext@4345fd45

scala> val sqlContext = new org.apache.spark.sql.SQLContext(sc)
sqlContext: org.apache.spark.sql.SQLContext = org.apache.spark.sql.SQLContext@293432e6

scala> import sqlContext.createSchemaRDD
<console>:31: error: value createSchemaRDD is not a member of org.apache.spark.sql.SQLContext
       import sqlContext.createSchemaRDD
Also, createSchemaRDD is not shown as a member when I press the TAB key, yet it is given as a valid import in the Spark SQL documentation.
Thanks
Sridhar
Created 03-16-2016 11:47 PM
Hi Sridhar, can you post what version of Spark you are running and a link to the documentation you're referring to?
Created 03-16-2016 11:49 PM
Hi Robert, the Spark version is 1.6.0, and the documentation I am referring to is
https://spark.apache.org/docs/1.1.0/sql-programming-guide.html
Created 03-17-2016 12:03 AM
Hi Robert, it works correctly if I use
import sqlContext.implicits._
which is the correct import for Spark 1.6.0.
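For context, createSchemaRDD belonged to the old Spark 1.x SchemaRDD API and was removed when DataFrames replaced SchemaRDD (Spark 1.3+); in 1.6 the RDD-to-DataFrame conversions come from sqlContext.implicits._ instead. A minimal sketch of the 1.6-style usage (the Person case class and sample data are illustrative, not from this thread):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Hypothetical case class used only to illustrate the conversion
case class Person(name: String, age: Int)

object ImplicitsExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("implicits-example").setMaster("local[*]"))
    val sqlContext = new SQLContext(sc)

    // Replaces the pre-1.3 `import sqlContext.createSchemaRDD`
    import sqlContext.implicits._

    // toDF() on an RDD of case classes is enabled by the implicits import
    val people = sc.parallelize(Seq(Person("Ann", 30), Person("Bob", 25))).toDF()
    people.registerTempTable("people")
    sqlContext.sql("SELECT name FROM people WHERE age > 26").show()

    sc.stop()
  }
}
```

In spark-shell, sc and sqlContext are already created for you, so only the import sqlContext.implicits._ line is needed before calling toDF().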
Now I am referring to the 1.6.1 documentation:
http://spark.apache.org/docs/latest/sql-programming-guide.html
Is this documentation OK, or should I refer to another doc specifically for 1.6.0?
Created 03-17-2016 02:21 AM
Sridhar, as long as you're using Spark 1.6 I'd refer to https://spark.apache.org/docs/1.6.1/sql-programming-guide.html
Created 03-18-2016 05:10 AM
Thanks Robert!