Support Questions
Find answers, ask questions, and share your expertise

Spark: If I use SparkSession, Am I Using Hive Context?

New Contributor

I can use SparkSession to get the list of tables in Hive, or to access a Hive table, as shown in the code below. Now my question is: in this case, am I using Spark with a Hive context?

Or, to use the Hive context in Spark, must I use the HiveContext object directly to access tables and perform other Hive-related functions?
val personnelTable = spark.catalog.getTable("personnel")

Expert Contributor

I assume you're on Spark 2?

SparkSession encapsulates SparkConf, SparkContext, and SQLContext, so you don't need to create them explicitly.

In Spark 2.0, SparkSession also merged SQLContext and HiveContext into a single object.
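For comparison, here is a sketch of the Spark 1.x pattern that this merge replaces (the app name is illustrative; assumes a Spark 1.x classpath, where HiveContext still exists):

```scala
// Spark 1.x style: a separate HiveContext built on top of a SparkContext.
// In Spark 2.x this class is deprecated in favor of SparkSession.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val conf = new SparkConf().setAppName("HiveContextExample")
val sc   = new SparkContext(conf)
val hiveContext = new HiveContext(sc)

// Hive-backed SQL goes through the HiveContext explicitly
hiveContext.sql("SHOW TABLES").show()
```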

When building a session object, for example:

val spark = SparkSession
  .builder()
  .appName("SparkSessionZipsExample")
  .config("spark.sql.warehouse.dir", warehouseLocation)
  .enableHiveSupport()
  .getOrCreate()

.enableHiveSupport() provides the HiveContext functionality. Once it is enabled, the session is connected to the Hive metastore, which is why you're able to use the catalog functions.
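Putting it together, a minimal sketch of a Hive-enabled session using the catalog, with the "personnel" table from the question (assumes that table exists in your Hive metastore; the app name is illustrative):

```scala
import org.apache.spark.sql.SparkSession

// With .enableHiveSupport(), the session's catalog talks to the Hive metastore
val spark = SparkSession
  .builder()
  .appName("HiveCatalogExample")
  .enableHiveSupport()
  .getOrCreate()

// Both of these go through the Hive metastore
spark.catalog.listTables().show()
val personnelTable = spark.catalog.getTable("personnel")

// Plain SQL against the Hive table works the same way
spark.sql("SELECT * FROM personnel LIMIT 10").show()
```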

You'll get more clarity by reading this

New Contributor

Thanks for the reply. Does this mean that the spark object in spark-shell already has enableHiveSupport() enabled? Or are the spark.sql() and spark.catalog functions that the spark object provides implemented by SparkSession even without enableHiveSupport()?

New Contributor

Yes, the spark object in spark-shell already has enableHiveSupport() enabled.
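One way to confirm this yourself from inside spark-shell is to check which catalog implementation the session is using (a quick sketch; the value is "hive" when Hive support is enabled and "in-memory" when it is not):

```scala
// Run inside spark-shell, where `spark` is the pre-built SparkSession
val impl = spark.conf.get("spark.sql.catalogImplementation")
println(impl)  // "hive" if the shell was built with Hive support
```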
