Support Questions


How does Spark connect with Hive JDBC?


I encountered a weird behaviour of Hive JDBC. When I create a table like

create table my_table (...other good stuff...)

it gets created inside the database named default, but when I do

create table mydb.mytable (...stuff...)

it gets created inside mydb. I am using Spark and Hive. Previously I would do something like

Connection con = DriverManager.getConnection("jdbc:hive2://localhost:10000/", "hiveuser", "hivepassword");
Statement stmt = con.createStatement();

Here I could specify the database in the URL:

DriverManager.getConnection("jdbc:hive2://localhost:10000/mydb", "hiveuser", "hivepassword");
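For context, the full JDBC flow looks roughly like this (a sketch, assuming HiveServer2 is running on localhost:10000, the hive-jdbc driver is on the classpath, and mydb already exists):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class HiveJdbcExample {
    public static void main(String[] args) throws Exception {
        // The database is part of the JDBC URL, so unqualified table
        // names resolve against mydb instead of default.
        try (Connection con = DriverManager.getConnection(
                "jdbc:hive2://localhost:10000/mydb", "hiveuser", "hivepassword");
             Statement stmt = con.createStatement()) {
            stmt.execute("CREATE TABLE IF NOT EXISTS mytable (id INT, name STRING)");
        }
    }
}
```

With the database in the URL, every statement on that connection runs against mydb unless a table name is explicitly qualified.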

but now I am using Spark, so I am doing

SparkSession spark = SparkSession
        .builder()
        .appName("Java Spark SQL basic example")
        .config("spark.sql.warehouse.dir", "hdfs://saurab:9000/user/hive/warehouse")
        .config("hive.metastore.warehouse.dir", "hdfs://saurab:9000/user/hive/warehouse")
        .enableHiveSupport()
        .getOrCreate();

I see no configuration option to specify the database name. That's why I have to fully qualify the database whenever I want to do any CRUD.

So how does Spark connect to Hive?


Expert Contributor

Hello @Saurab Dahal,

You should use the HiveContext object (or, on Spark 2.x, a SparkSession built with enableHiveSupport()) to connect to and query Hive.

Here in the answer you can see a small example.
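On Spark 2.x the same idea looks roughly like this (a sketch; the warehouse path is taken from your snippet, and mydb is assumed to already exist in the metastore):

```java
import org.apache.spark.sql.SparkSession;

public class SparkHiveExample {
    public static void main(String[] args) {
        // enableHiveSupport() wires the session to the Hive metastore,
        // playing the role HiveContext did in Spark 1.x.
        SparkSession spark = SparkSession
                .builder()
                .appName("Java Spark SQL basic example")
                .config("spark.sql.warehouse.dir", "hdfs://saurab:9000/user/hive/warehouse")
                .enableHiveSupport()
                .getOrCreate();

        // There is no "default database" config option; instead, switch
        // the current database with USE, or fully qualify table names.
        spark.sql("USE mydb");
        spark.sql("CREATE TABLE IF NOT EXISTS mytable (id INT, name STRING)");
        spark.sql("SELECT * FROM mydb.mytable").show();

        spark.stop();
    }
}
```

After spark.sql("USE mydb"), unqualified table names in subsequent statements resolve against mydb rather than default, which addresses the behaviour you described.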

