Created 12-26-2018 07:23 AM
Hi Folks,
Hope all are doing well!
I've upgraded HDP 2.6.5 to HDP 3.0.1.0-187 successfully. Now I'm trying to connect to Hive databases using spark-shell, but I'm unable to see any Hive databases. I even copied /etc/hive/conf/hive-site.xml to /etc/spark2/conf/ and restarted the Spark service, but after the restart hive-site.xml reverts to the original XML file.
Is there any alternative solution to resolve this issue?
Kindly assist me in fixing it.
Created 01-02-2019 11:49 AM
Yes, you need to enable Interactive query.
Did you follow these steps: LLAP & Interactive query?
Remember also to enable YARN pre-emption via the YARN config.
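In Ambari this is the YARN pre-emption toggle; as a plain yarn-site sketch (property name per stock YARN, the value is simply what the LLAP prerequisites call for):
yarn.resourcemanager.scheduler.monitor.enable=true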
HTH
Created 01-02-2019 02:24 PM
Ohh, I did not enable pre-emption via the YARN config; that is the only point still pending. I have completed the rest.
Let me try enabling YARN pre-emption. I will update you once it's done.
Created 01-03-2019 05:34 AM
No luck. Pre-emption is already enabled via the YARN config and all the other prerequisites have been completed. The Hive interactive query service is running fine. Still:
19/01/03 05:16:45 INFO RetryingMetaStoreClient: RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=vinay@TEST.COM (auth:KERBEROS) retries=1 delay=5 lifetime=0
19/01/03 05:16:47 INFO CodeGenerator: Code generated in 294.781928 ms
19/01/03 05:16:47 INFO CodeGenerator: Code generated in 18.011739 ms
+------------+
|databaseName|
+------------+
|     default|
+------------+
Created 01-03-2019 10:10 AM
@Vinay
So now interactive query is running fine and it no longer throws errors, except that you can't see any databases other than "default"?
In HDP 3.0, Spark uses its own separate catalog; this explains why you can't see any Hive databases. To work with Hive databases you should use the HiveWarehouseConnector. Please follow this documentation: Configuring HiveWarehouseConnector
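As a minimal illustration of the two catalogs (the builder call is the same one shown in the answer below; no cluster-specific values are assumed):
// Spark's own catalog on HDP 3.x: Hive metastore databases do not show up here
spark.sql("show databases").show()   // typically returns only "default"
// A Hive Warehouse Connector session goes through HiveServer2 Interactive / LLAP,
// so the Hive metastore databases become visible
val hive = com.hortonworks.spark.sql.hive.llap.HiveWarehouseBuilder.session(spark).build()
hive.showDatabases().show(100, false)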
Please revert
HTH
Created 01-04-2019 07:42 AM
Yes, interactive query is running fine.
I have edited the below properties in the custom spark2-defaults configuration (a sketch with placeholder values follows the list):
spark.sql.hive.hiveserver2.jdbc.url.principal
spark.hadoop.hive.zookeeper.quorum
spark.hadoop.hive.llap.daemon.service.hosts
spark.datasource.hive.warehouse.load.staging.dir
spark.datasource.hive.warehouse.metastoreUri
spark.sql.hive.hiveserver2.jdbc.url
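For reference, a sketch of how these could look in Custom spark2-defaults; every host name, port, realm, and path below is a placeholder, not a value taken from this cluster:
spark.sql.hive.hiveserver2.jdbc.url=jdbc:hive2://hiveserver2-interactive-host:10500/
spark.sql.hive.hiveserver2.jdbc.url.principal=hive/_HOST@EXAMPLE.COM
spark.hadoop.hive.zookeeper.quorum=zk1:2181,zk2:2181,zk3:2181
spark.hadoop.hive.llap.daemon.service.hosts=@llap0
spark.datasource.hive.warehouse.load.staging.dir=/tmp
spark.datasource.hive.warehouse.metastoreUri=thrift://metastore-host:9083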
After restarting, I ran spark-shell and executed:
sql("show databases").show()
Still, only the DEFAULT database is visible.
Created 01-07-2019 07:31 AM
Thanks @Geoffrey Shelton Okot
Created 01-07-2019 06:17 AM
Hi Vinay,
Use the code below to connect to Hive and list the databases:
spark-shell --conf spark.sql.hive.hiveserver2.jdbc.url="jdbc:hive2://hiveserverip:10000/" --conf spark.datasource.hive.warehouse.load.staging.dir="/tmp" --conf spark.hadoop.hive.zookeeper.quorum="zookeeperquorumip:2181" --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-1.0.0.3.0.0.0-1634.jar
val hive = com.hortonworks.spark.sql.hive.llap.HiveWarehouseBuilder.session(spark).build()
hive.showDatabases().show(100, false)
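Once the session is built, the same object can be pointed at a Hive database and queried; the database and table names below are hypothetical:
hive.setDatabase("testdb")   // hypothetical database name
hive.executeQuery("select * from testtable limit 10").show(10, false)   // hypothetical table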
Reference article
Created 01-07-2019 07:31 AM
Thanks @subhash parise
Created 01-07-2019 09:14 AM
Nice that it worked out; the solution wasn't far off!
Created 01-10-2019 08:47 AM
We were almost there. Thanks again @Geoffrey Shelton Okot