How does Spark connect with Hive JDBC?

Contributor

I encountered a weird behaviour of Hive JDBC. When I create a table like

create table my_table (...other good stuff...)

it gets created inside the database named default, but when I do

create table mydb.mytable (...stuff...)

it gets created inside mydb. I am using Spark and Hive. Previously I would do something like

Connection con = DriverManager.getConnection("jdbc:hive2://localhost:10000/", "hiveuser", "hivepassword");
 Statement stmt = con.createStatement();

Here I could specify the database:

DriverManager.getConnection("jdbc:hive2://localhost:10000/mydb", "hiveuser", "hivepassword");
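
Roughly, the complete version of what I had looked like this (the database, table, user and password are just placeholders; it assumes the hive-jdbc driver is on the classpath):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcExample {
    public static void main(String[] args) throws Exception {
        // The target database (mydb) is selected directly in the JDBC URL
        Connection con = DriverManager.getConnection(
                "jdbc:hive2://localhost:10000/mydb", "hiveuser", "hivepassword");
        Statement stmt = con.createStatement();

        // Unqualified table names now resolve against mydb
        ResultSet rs = stmt.executeQuery("SELECT * FROM mytable");
        while (rs.next()) {
            System.out.println(rs.getString(1));
        }

        rs.close();
        stmt.close();
        con.close();
    }
}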

But now I am using Spark, so I am doing

SparkSession spark = SparkSession
        .builder()
        .appName("Java Spark SQL basic example")
        .enableHiveSupport()
        .config("spark.sql.warehouse.dir", "hdfs://saurab:9000/user/hive/warehouse")
        .config("hive.metastore.warehouse.dir", "hdfs://saurab:9000/user/hive/warehouse")
        .master("local")
        .getOrCreate();

I see no configuration option to specify the database name. That's why I need to explicitly qualify the database on every statement if I want to do any CRUD.
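
For example, with the session above I currently have to qualify the database everywhere, along these lines (mydb.mytable and its columns are just made-up placeholders):

spark.sql("CREATE TABLE IF NOT EXISTS mydb.mytable (id INT, name STRING)");
spark.sql("SELECT * FROM mydb.mytable").show();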

So how does Spark connect to Hive?

1 REPLY

Expert Contributor

Hello @Saurab Dahal,

You should use the Hive context object to connect to and query Hive.

In the answer here you can see a small example:

https://community.hortonworks.com/questions/93392/how-to-connect-and-run-hive-query-from-apache-spar...
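
For reference, here is a minimal sketch along those lines (Spark 1.x style HiveContext; the database and table names are just examples, and it assumes hive-site.xml is on the classpath. In Spark 2.x a SparkSession built with enableHiveSupport() plays the same role):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.hive.HiveContext;

public class HiveQueryExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("HiveQueryExample").setMaster("local");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // The HiveContext reads hive-site.xml from the classpath
        // and talks to the Hive metastore
        HiveContext hiveContext = new HiveContext(sc.sc());

        // Select the working database, then query with unqualified table names
        hiveContext.sql("USE mydb");
        hiveContext.sql("SELECT * FROM mytable").show();

        sc.stop();
    }
}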

Michel