Created 12-26-2018 07:23 AM
Hi Folks,
Hope you are all doing well!
I've upgraded HDP 2.6.5 to HDP 3.0.1.0-187 successfully. Now I'm trying to connect to Hive databases using spark-shell, but I'm unable to see any Hive databases. I have even copied /etc/hive/conf/hive-site.xml to /etc/spark2/conf/ and restarted the Spark service, but after the restart hive-site.xml reverts to the original file.
Is there any alternative solution to resolve the issue?
Kindly assist me to fix the issue.
Created 01-07-2019 06:17 AM
Hi Vinay,
Use the code below to connect to Hive and list the databases:
spark-shell --conf spark.sql.hive.hiveserver2.jdbc.url="jdbc:hive2://hiveserverip:10000/" --conf spark.datasource.hive.warehouse.load.staging.dir="/tmp" --conf spark.hadoop.hive.zookeeper.quorum="zookeeperquorumip:2181" --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-1.0.0.3.0.0.0-1634.jar
val hive = com.hortonworks.spark.sql.hive.llap.HiveWarehouseBuilder.session(spark).build()
hive.showDatabases().show(100, false)
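If the connection works, the same session object can also be used to browse and query tables. A minimal sketch (the database and table names below are placeholders, not from this thread):
hive.setDatabase("mydb")                                      // placeholder database name
hive.showTables().show(100, false)
val df = hive.executeQuery("SELECT * FROM mytable LIMIT 10")  // placeholder table name
df.show(false)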
Reference article
Created 05-07-2019 05:42 AM
Try changing "metastore.catalog.default" to "hive" instead of "spark" in the Spark settings to see all Hive schemas.
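For reference, a rough sketch of setting this from application code instead of the Ambari settings (the property key is prefixed with spark.hadoop. so it reaches the metastore client; the app name is a placeholder):
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("HiveCatalogCheck")                               // placeholder app name
  .config("spark.hadoop.metastore.catalog.default", "hive")  // read the Hive catalog, not the separate Spark catalog
  .enableHiveSupport()
  .getOrCreate()

spark.sql("show databases").show(false)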
Created 07-25-2019 03:44 PM
Hi
I have followed all the above configurations and finally managed to figure out that spark.hadoop.metastore.catalog.default was set to spark. If you change it to hive on the command line, as sketched below, it shows all my Hive metastore catalog tables.
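For example, a sketch of that kind of invocation (adjust to your environment):
spark-shell --conf spark.hadoop.metastore.catalog.default=hive
scala> spark.sql("show databases").show(100, false)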
Thanks
Naga
Created 04-07-2021 03:02 AM
Huge thanks. It works for me.
Created 07-26-2019 08:13 PM
Hi, I got the below error while developing HWC code on my local machine. Could you help me with the correct configuration for working with Spark in local mode?
Caused by: java.util.NoSuchElementException: spark.sql.hive.hiveserver2.jdbc.url
at org.apache.spark.sql.internal.SQLConf$$anonfun$getConfString$2.apply(SQLConf.scala:1571)
at org.apache.spark.sql.internal.SQLConf$$anonfun$getConfString$2.apply(SQLConf.scala:1571)
Code :
Dependencies I am using in pom.xml:
<dependency>
  <groupId>com.hortonworks</groupId>
  <artifactId>spark-llap_2-11</artifactId>
  <version>1.0.2-2.1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.hortonworks.hive/hive-warehouse-connector -->
<dependency>
  <groupId>com.hortonworks.hive</groupId>
  <artifactId>hive-warehouse-connector_2.11</artifactId>
  <version>1.0.0.3.1.2.1-1</version>
</dependency>
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import com.hortonworks.hwc.HiveWarehouseSession

val sparkConfig = new SparkConf()
sparkConfig.set("spark.broadcast.compress", "false")
sparkConfig.set("spark.shuffle.compress", "false")
sparkConfig.set("spark.shuffle.spill.compress", "false")
sparkConfig.set("spark.io.compression.codec", "lzf")
sparkConfig.set("spark.sql.catalogImplementation", "hive")
sparkConfig.set("hive.exec.dynamic.partition.mode", "nonstrict")
sparkConfig.set("spark.default.parallelism", "1")
sparkConfig.set("spark.sql.shuffle.partitions", "1")   // key is spark.sql.shuffle.partitions
sparkConfig.set("spark.sql.hive.llap", "true")
sparkConfig.set("spark.datasource.hive.warehouse.load.staging.dir", "/tmp")
sparkConfig.set("spark.hadoop.hive.llap.daemon.service.hosts", "@llap0")
// Note: property keys must not contain leading or trailing spaces
sparkConfig.set("spark.hadoop.hive.zookeeper.quorum", "host1:2181;host2:2181;host3:2181")
sparkConfig.set("spark.hadoop.metastore.catalog.default", "hive")

val _spark: SparkSession = SparkSession.builder
  .master("local")
  .appName("Unit Test")
  .config(sparkConfig)
  .enableHiveSupport()
  .getOrCreate()

println("Spark Session Initialized")

val hive = HiveWarehouseSession.session(_spark).build()
hive.showDatabases().show(false)
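The NoSuchElementException above indicates that the Hive Warehouse Connector builder cannot find spark.sql.hive.hiveserver2.jdbc.url in the Spark conf, so that setting has to be supplied as well. A sketch of the missing line (the host name is a placeholder; use your HiveServer2 Interactive JDBC URL):
// HWC reads the HiveServer2 JDBC URL from the Spark conf; without it the builder
// throws NoSuchElementException. "hiveserver-host" is a placeholder.
sparkConfig.set("spark.sql.hive.hiveserver2.jdbc.url", "jdbc:hive2://hiveserver-host:10000/")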