
Spark unable to connect to Hive database in HDP 3.0.1

Solved

Re: Spark unable to connect to Hive database in HDP 3.0.1

New Contributor

Try changing "metastore.catalog.default" from "spark" to "hive" in the Spark settings to see all the Hive schemas.
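If it helps, the same switch can be applied when building a SparkSession in code. A minimal sketch (the app name is a placeholder, and note that from Spark the key takes the spark.hadoop. prefix so it is forwarded to the Hadoop/Hive configuration):

import org.apache.spark.sql.SparkSession

// Minimal sketch: select the Hive catalog instead of the Spark catalog.
val spark = SparkSession.builder()
  .appName("hive-catalog-check") // placeholder name, not from this thread
  .config("spark.hadoop.metastore.catalog.default", "hive")
  .enableHiveSupport()
  .getOrCreate()

// With the Hive catalog selected, the Hive schemas should now be listed.
spark.sql("SHOW DATABASES").show()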

Re: Spark unable to connect to Hive database in HDP 3.0.1

New Contributor

Hi,

I have followed all of the configurations above and finally managed to figure out that spark.hadoop.metastore.catalog.default was set to spark. If you change this to hive on the command line as listed below, all of the Hive metastore catalog tables show up.

  • spark-shell --conf spark.hadoop.metastore.catalog.default=hive
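(To make this permanent rather than per-session, the same key can also go into spark-defaults.conf, in HDP typically via the Spark2 configs in Ambari, as a line of the form spark.hadoop.metastore.catalog.default hive.)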

Thanks

Naga

Re: Spark unable to connect to Hive database in HDP 3.0.1

New Contributor

Huge thanks. It works for me.

Re: Spark unable to connect to Hive database in HDP 3.0.1

New Contributor

Hi, I got the below error when developing HWC (Hive Warehouse Connector) code on my local machine. Could you help me with the correct configuration for working with Spark in local mode?

Caused by: java.util.NoSuchElementException: spark.sql.hive.hiveserver2.jdbc.url
  at org.apache.spark.sql.internal.SQLConf$$anonfun$getConfString$2.apply(SQLConf.scala:1571)
  at org.apache.spark.sql.internal.SQLConf$$anonfun$getConfString$2.apply(SQLConf.scala:1571)


Code:

Dependencies I am using in pom.xml:

    <dependency>
        <groupId>com.hortonworks</groupId>
        <artifactId>spark-llap_2.11</artifactId>
        <version>1.0.2-2.1</version>
    </dependency>

    <!-- https://mvnrepository.com/artifact/com.hortonworks.hive/hive-warehouse-connector -->
    <dependency>
        <groupId>com.hortonworks.hive</groupId>
        <artifactId>hive-warehouse-connector_2.11</artifactId>
        <version>1.0.0.3.1.2.1-1</version>
    </dependency>
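One thing worth double-checking (my note, not from the original post): the HWC artifact version is tied to the HDP release, and on an HDP 3.x cluster the connector assembly jar already ships under /usr/hdp/current/hive_warehouse_connector/, so it is common to mark the dependency as provided and compile against the exact version the cluster runs to avoid mismatches.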
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import com.hortonworks.hwc.HiveWarehouseSession

val sparkConfig = new SparkConf()

sparkConfig.set("spark.broadcast.compress", "false")
sparkConfig.set("spark.shuffle.compress", "false")
sparkConfig.set("spark.shuffle.spill.compress", "false")
sparkConfig.set("spark.io.compression.codec", "lzf")
sparkConfig.set("spark.sql.catalogImplementation", "hive")
sparkConfig.set("hive.exec.dynamic.partition.mode", "nonstrict")
sparkConfig.set("spark.default.parallelism", "1")
sparkConfig.set("spark.sql.shuffle.partitions", "1") // was "spark.shuffle.partitions", which is not a Spark key
sparkConfig.set("spark.sql.hive.llap", "true")
sparkConfig.set("spark.datasource.hive.warehouse.load.staging.dir", "/tmp")
sparkConfig.set("spark.hadoop.hive.llap.daemon.service.hosts", "@llap0")
// The next two keys had stray spaces inside the quotes in the original post,
// so the values were stored under the wrong key names and never took effect.
sparkConfig.set("spark.hadoop.hive.zookeeper.quorum", "host1:2181;host2:2181;host3:2181")
sparkConfig.set("spark.hadoop.metastore.catalog.default", "hive")

val _spark:SparkSession = SparkSession.builder
  .master("local")
  .appName("Unit Test")
  .config(sparkConfig)
  .enableHiveSupport()
  .getOrCreate()

println("Spark Session Initialized")
val hive = HiveWarehouseSession.session(_spark).build()
hive.showDatabases().show() // showDatabases() returns a DataFrame; .show() prints the rows
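One caveat worth flagging (my note, not from the thread): even with .master("local"), HWC still opens a JDBC connection to HiveServer2 Interactive and talks to the metastore, so the LLAP endpoint, ZooKeeper quorum, and metastore URI must all be reachable from the developer machine. For unit tests with no cluster access, a plain SparkSession with enableHiveSupport() and a local metastore is the usual fallback, since HWC itself cannot run fully offline.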