Spark unable to connect Hive database in HDP 3.0.1

Rising Star

Hi Folks,

Hope all are doing well!

I've upgraded HDP 2.6.5 to HDP 3.0.1.0-187 successfully. Now I'm trying to connect to Hive databases using spark-shell, but I'm unable to see any Hive databases. I even copied /etc/hive/conf/hive-site.xml to /etc/spark2/conf/ and restarted the Spark service, but after the restart hive-site.xml reverted to the original file.

Is there any alternative solution to resolve this issue?

Kindly assist me in fixing it.

1 ACCEPTED SOLUTION

Super Collaborator

Hi Vinay,

Use the code below to connect to Hive and list the databases:

spark-shell \
  --conf spark.sql.hive.hiveserver2.jdbc.url="jdbc:hive2://hiveserverip:10000/" \
  --conf spark.datasource.hive.warehouse.load.staging.dir="/tmp" \
  --conf spark.hadoop.hive.zookeeper.quorum="zookeeperquorumip:2181" \
  --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-1.0.0.3.0.0.0-1634.jar

val hive = com.hortonworks.spark.sql.hive.llap.HiveWarehouseBuilder.session(spark).build()

hive.showDatabases().show(100, false)
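
For example, a minimal sketch of running a query through the same session (mydb and mytable are placeholder names, not taken from this thread):

// point the session at a database and run a query through the connector
hive.setDatabase("mydb")
hive.executeQuery("SELECT * FROM mytable LIMIT 10").show(10, false)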

Reference article

https://github.com/hortonworks-spark/spark-llap/tree/master


33 REPLIES

Expert Contributor

Hi Vinay,

From HDP 3.0 onwards, to work with Hive databases from Spark you should use the Hive Warehouse Connector (HWC) library.

Please refer to the documentation below.

https://docs.hortonworks.com/HDPDocuments/HDP3/HDP-3.0.0/integrating-hive/content/hive_configure_a_s...

Hope this helps!

Rising Star

Hi @Sampath Kumar

I have enabled Hive interactive query and added the following properties to the custom spark2-defaults configuration:

spark.hadoop.hive.zookeeper.quorum=sidchadoop04.test.com:2181

spark.hadoop.hive.llap.daemon.service.hosts=@llap0

spark.datasource.hive.warehouse.load.staging.dir=/tmp

spark.datasource.hive.warehouse.metastoreUri=thrift://sidchadoop04.test.com:9083

spark.sql.hive.hiveserver2.jdbc.url=jdbc:hive2://sidchadoop04.test.com:10500

But the Hive databases are still not accessible from spark-shell.

Super Collaborator

Hi @Vinay,

Please connect to HiveServer2 instead of HiveServer2 Interactive, using the syntax below:

spark.sql.hive.hiveserver2.jdbc.url=jdbc:hive2://sidchadoop04.test.com:10000

Rising Star

Hi @subhash parise

I have tried the same, but the Hive databases are still not visible.

Super Collaborator

Hi @Vinay

Please use the syntax below to connect to Hive from Spark:

spark-shell \
  --conf spark.sql.hive.hiveserver2.jdbc.url="jdbc:hive2://************************:10000/" \
  --conf spark.datasource.hive.warehouse.load.staging.dir="/tmp" \
  --conf spark.hadoop.hive.zookeeper.quorum="ip.************:2181" \
  --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-1.0.0.3.0.0.0-1634.jar

Refer to the document below for database operations in Hive using Spark:

https://github.com/hortonworks-spark/spark-llap/tree/master
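
Once the session object (hive) is built as shown in the accepted solution, a minimal sketch of the database operations it supports (testdb and t1 are placeholder names):

// create a database and table through the connector, then list tables
hive.executeUpdate("CREATE DATABASE IF NOT EXISTS testdb")
hive.setDatabase("testdb")
hive.executeUpdate("CREATE TABLE IF NOT EXISTS t1 (id INT, name STRING)")
hive.showTables().show()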

Please accept the answer if it helps.

Thank you.

Rising Star

@subhash parise

I have also tried:

spark-shell \
  --conf spark.sql.hive.hiveserver2.jdbc.url="jdbc:hive2://************************:10000/" \
  --conf spark.datasource.hive.warehouse.load.staging.dir="/tmp" \
  --conf spark.hadoop.hive.zookeeper.quorum="ip.************:2181" \
  --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-1.0.0.3.0.0.0-1634.jar

But I get the same result; only the default database is visible.

Master Mentor

@Vinay

This article explains it all: Hive Warehouse Connector for accessing Apache Spark data.
The Hive Warehouse Connector supports the following applications:

  • Spark shell
  • PySpark
  • The spark-submit script
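
For reference, a rough sketch of a spark-submit invocation with the connector; the jar path and configuration values mirror the ones used earlier in this thread, while the application class and jar (com.example.MyApp, my-app.jar) are placeholders:

spark-submit --master yarn \
  --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-1.0.0.3.0.0.0-1634.jar \
  --conf spark.sql.hive.hiveserver2.jdbc.url="jdbc:hive2://hiveserverip:10000/" \
  --conf spark.datasource.hive.warehouse.load.staging.dir="/tmp" \
  --conf spark.hadoop.hive.zookeeper.quorum="zookeeperquorumip:2181" \
  --class com.example.MyApp my-app.jar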

Explorer

Hello, how do you load your spark-shell in this version? Because I can't access it.