Support Questions


How to use sqlContext? Getting an error!


Please find the attached snapshot (sqlcontext.png) for the issue.

1 ACCEPTED SOLUTION

@sudhir reddy

SqlContext is not available by default in the Spark 2 shell.

Create a sqlContext using the statement below after launching spark-shell; you can then read the JSON using this sqlContext.

val sqlContext = new org.apache.spark.sql.SQLContext(sc)

Let me know if this helps.
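As a side note beyond the accepted answer: in Spark 2, spark-shell also predefines a SparkSession named `spark`, which is the preferred entry point and can read JSON directly without creating a separate SQLContext. A minimal sketch, assuming the same example file path from this thread:

```scala
// Inside the Spark 2 shell, `spark` (a SparkSession) is already defined.
// Reading the JSON through it avoids constructing a SQLContext by hand.
val df = spark.read.json("/user/root/Example2.JSON")

// Inspect the inferred schema and a few rows to confirm the file was parsed.
df.printSchema()
df.show(5)
```

Either approach works in Spark 2; the SQLContext route in the accepted answer is the closest drop-in replacement for Spark 1.x code.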


10 REPLIES

Master Mentor

@sudhir reddy

Are you using Spark 2?

Have you set the environment variable as follows before opening the shell?

# export SPARK_MAJOR_VERSION=2
# spark-shell



Yes, I am using Spark 2; it is the default on my setup. What else should I try? @Jay Kumar SenSharma

Master Mentor

@sudhir reddy

Can you please share the complete output along with the command that you ran?


val rd = sqlContext.read.json("/user/root/Example2.JSON") // Used this command.

Got this output:

<console>:23: error: not found: value sqlContext
       val rd = sqlContext.read.json("/user/root/Example2.JSON");
                ^

@Jay Kumar SenSharma

Master Mentor

@sudhir reddy

How are you logging in to the shell? Which command are you using to launch the spark shell? Which mode?

For example, when you run "spark-shell" you will see a lot of output; we need that complete output.

Example

# spark-shell


The more information you can provide about your issue, the better the chance that it gets resolved quickly. Otherwise more back-and-forth will be needed to find out which HDP version you are using, what the logs show, and so on.



Thanks! It worked.

New Contributor
// In the spark-shell, sc (an existing SparkContext) is already available.
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
val rd = sqlContext.read.json("/user/root/Example2.JSON")


Thank you!