How to use sqlContext? Getting an error!
Labels: Apache Spark
Created ‎07-04-2018 10:59 AM
Please find the attached snapshot (sqlcontext.png) for the issue.
Created ‎07-04-2018 12:16 PM
SQLContext is not available by default in the Spark2 shell.
Create a sqlContext with the statement below after launching spark-shell; you can then read the JSON using this sqlContext.
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
Let me know if this helps.
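As a side note (assuming Spark 2.x, where spark-shell pre-creates a SparkSession named `spark`), the same read can also be done without constructing a SQLContext at all. This is a sketch to run inside spark-shell, using the path from the question:

```scala
// Inside spark-shell (Spark 2.x): `spark` is a pre-created SparkSession.
// The path below is the one from the question; adjust for your cluster.
val df = spark.read.json("/user/root/Example2.JSON")
df.printSchema()

// For code written against the 1.x API, a SQLContext is also
// reachable from the session:
val legacyCtx = spark.sqlContext
```

Either approach works; the SparkSession route is the idiomatic one in Spark 2.x.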
Created ‎07-04-2018 11:05 AM
Are you using Spark2?
Have you set the environment variable as follows before opening the shell?
# export SPARK_MAJOR_VERSION=2
# spark-shell
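To make the step above concrete: on an HDP-style install where both Spark 1 and Spark 2 are present (an assumption about your environment), spark-shell is a version-switching wrapper, and the variable must be exported in the same session before launching it:

```shell
# Ask HDP's launcher scripts for the Spark 2 binaries.
export SPARK_MAJOR_VERSION=2
echo "SPARK_MAJOR_VERSION=${SPARK_MAJOR_VERSION}"
```

After this, running spark-shell in the same session starts the Spark 2 REPL.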
Created ‎07-04-2018 11:12 AM
Yes, I am using Spark2; it is the default. What else should I try? @Jay Kumar SenSharma
Created ‎07-04-2018 11:14 AM
Created ‎07-04-2018 11:18 AM
Used this command:
val rd = sqlContext.read.json("/user/root/Example2.JSON");
Got this output:
<console>:23: error: not found: value sqlContext
       val rd = sqlContext.read.json("/user/root/Example2.JSON");
                ^
@Jay Kumar SenSharma
Created ‎07-04-2018 11:27 AM
How are you logging in to the shell? Which command are you using to launch the spark shell, and in which mode?
For example, when you run spark-shell you will see lots of output; we need that complete output.
Example:
# spark-shell
The more information you can provide about your issue, the faster it is likely to be resolved. Otherwise there will be more back and forth to find out which HDP version you are using, what logging is happening, etc.
Created ‎07-04-2018 01:08 PM
Thanks! It worked.
Created ‎07-04-2018 12:25 PM
val sc: SparkContext // An existing SparkContext.
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
val rd = sqlContext.read.json("/user/root/Example2.JSON")
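One clarification on the snippet above: the first line (`val sc: SparkContext`) is a declaration-only placeholder from the Spark documentation and will not compile on its own in the REPL. Inside spark-shell, `sc` already exists, so the working sequence reduces to this sketch (path as used earlier in the thread):

```scala
// sc is pre-created by spark-shell; no declaration needed.
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
val rd = sqlContext.read.json("/user/root/Example2.JSON")
rd.show()
```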
Created ‎07-04-2018 01:09 PM
Thank you!
