
&lt;console&gt;:23: error: not found: value sqlContext

Contributor

I am following the tutorial a-lap-around-apache-spark. In the section "Using the Spark DataFrame API" I am stuck at the step val df = sqlContext.jsonFile("people.json").

It throws the error &lt;console&gt;:23: error: not found: value sqlContext.

Please guide me if you have come across a similar issue. I tried a few posts but had no luck.

Below are the actual steps.

1 ACCEPTED SOLUTION

Master Guru

You are trying to run the Spark2 spark-shell. Have you done "export SPARK_MAJOR_VERSION=2"?


9 REPLIES

Contributor
[spark@sandbox spark2-client]$ hdfs dfs -ls /user/spark
Found 3 items
drwxr-xr-x   - spark hdfs          0 2017-02-11 17:04 /user/spark/.sparkStaging
-rwxrwxrwx   1 spark hdfs         73 2017-02-12 00:21 /user/spark/people.json
-rwxrwxrwx   1 spark hdfs         32 2017-02-12 00:18 /user/spark/people.txt
[spark@sandbox spark2-client]$ ./bin/spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
17/02/12 03:00:11 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect.
Spark context Web UI available at http://172.17.0.2:4040
Spark context available as 'sc' (master = local[*], app id = local-1486868408718).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.0.2.5.0.0-1245
      /_/
Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_111)
Type in expressions to have them evaluated.
Type :help for more information.
scala> val df = sqlContext.jsonFile("people.json")
<console>:23: error: not found: value sqlContext
       val df = sqlContext.jsonFile("people.json")

Master Guru

You are trying to run the Spark2 spark-shell. Have you done "export SPARK_MAJOR_VERSION=2"?
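
For reference, setting the variable before launching the shell would look like this on the sandbox (prompt and path taken from the session above):

[spark@sandbox spark2-client]$ export SPARK_MAJOR_VERSION=2
[spark@sandbox spark2-client]$ ./bin/spark-shell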

Contributor

Yes, I used export SPARK MAJOR VERSION=2 without the "_". I guess that was the mistake. I will try export SPARK_MAJOR_VERSION=2.

Master Guru

Sorry, in Spark2 the sqlContext is not created by spark-shell. You can create it yourself from sc, followed by importing the implicits. Then it should work:

scala> val sqlContext = new org.apache.spark.sql.SQLContext(sc)
scala> import sqlContext.implicits._
scala> sqlContext.sql("show tables").show()
+-----------+-----------+
|  tableName|isTemporary|
+-----------+-----------+
|    flights|      false|
|flights_ext|      false|
+-----------+-----------+

You can also replace SQLContext with HiveContext:

scala> val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)

Actually, in Spark2 you are encouraged to use SparkSession, which includes the sqlContext functionality.
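
For example, with the built-in session the tutorial's jsonFile step becomes a spark.read call. A minimal sketch, assuming people.json sits in the user's HDFS home directory as in the listing above:

scala> val df = spark.read.json("people.json")
scala> df.show()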

Contributor
@Predrag Minovic.

Thank you sir. It worked.

Master Guru

A Spark session called "spark" is created when you run spark-shell in Spark2, so you can just query it as below. And please accept the answer if it was helpful. Thanks!

scala> spark.sql("show tables").show()
+-----------+-----------+
|  tableName|isTemporary|
+-----------+-----------+
|    flights|      false|
|flights_ext|      false|
+-----------+-----------+
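
If legacy code still expects a sqlContext, the Spark 2 session also exposes one for backward compatibility, so a sketch like the following should work as well:

scala> val sqlContext = spark.sqlContext
scala> sqlContext.sql("show tables").show()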

Rising Star

Thanks so much, worked like a charm!

Contributor
[spark@sandbox spark2-client]$ echo $SPARK_HOME
/usr/hdp/current/spark2-client
[spark@sandbox spark2-client]$ echo $SPARK_MAJOR_VERSION
2
[spark@sandbox spark2-client]$ ./bin/spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
17/02/12 03:58:15 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect.
Spark context Web UI available at http://172.17.0.2:4040
Spark context available as 'sc' (master = local[*], app id = local-1486871892126).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.0.2.5.0.0-1245
      /_/
Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_111)
Type in expressions to have them evaluated.
Type :help for more information.
scala> val df = sqlContext.jsonFile("people.json")
<console>:23: error: not found: value sqlContext
       val df = sqlContext.jsonFile("people.json")
                ^


@Predrag Minovic I tried setting SPARK_MAJOR_VERSION=2; it's not working.

New Contributor

Unable to replace SQLContext with HiveContext in a Scala program on Spark 1.6.
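
In case it helps, here is a minimal sketch of how a standalone Spark 1.6 Scala program would construct a HiveContext. It assumes the spark-hive artifact is on the classpath, and the object and app names are illustrative only:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object HiveContextExample {
  def main(args: Array[String]): Unit = {
    // Spark 1.6 style: build a SparkContext first, then wrap it
    val conf = new SparkConf().setAppName("HiveContextExample")
    val sc = new SparkContext(conf)

    // HiveContext takes the place of SQLContext; requires spark-hive on the classpath
    val hiveContext = new HiveContext(sc)
    // bring in the implicits (e.g. rdd.toDF), as in the spark-shell example above
    import hiveContext.implicits._

    hiveContext.sql("show tables").show()
    sc.stop()
  }
}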