Support Questions

How to access Hive databases/tables through the spark and pyspark shells?

Contributor
 
2 REPLIES

Expert Contributor
Hi Harish,

You can create a HiveContext and use it to access the Hive tables.

Example Program:

from pyspark.sql import HiveContext

# Create a HiveContext from the SparkContext (sc) provided by the pyspark shell
hive_context = HiveContext(sc)

# Read a Hive table as a DataFrame and display its rows
sample = hive_context.table("default.<tablename>")
sample.show()

Reference Link: https://stackoverflow.com/questions/36051091/query-hive-table-in-pyspark
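Note: on Spark 2.x and later, the SparkSession entry point (exposed as spark in the pyspark shell) supersedes HiveContext. A minimal sketch of the equivalent access, assuming the session was created with Hive support enabled:

from pyspark.sql import SparkSession

# In the pyspark shell this session already exists as `spark`;
# outside the shell, build it with Hive support enabled
spark = SparkSession.builder.enableHiveSupport().getOrCreate()

# Same table access as above, through the session
sample = spark.table("default.<tablename>")
sample.show()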

Explorer

From the spark-shell or pyspark shell, use the commands below to access Hive database objects.

spark.sql("show databases;")

spark.sql("select * from databasename.tablename;")

or

spark.read.table("databasename.tablename")

You can pass any supported SQL query to spark.sql; it returns the results as a DataFrame, which you can display with .show().
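For example, a minimal sketch in the pyspark shell (reusing the placeholder databasename.tablename from above) of running a query and working with the returned DataFrame:

# `spark` is the SparkSession that the spark-shell / pyspark shell creates for you
spark.sql("show databases").show()

# The query result comes back as a DataFrame
df = spark.sql("select * from databasename.tablename")
df.printSchema()  # inspect the columns
df.show(10)       # display the first 10 rows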