
Spark unable to connect to Hive database in HDP 3.0.1


Re: Spark unable to connect to Hive database in HDP 3.0.1


Hi, I get the error below when I run HWC code locally. Could you help me with the correct configuration for working with Spark in local mode?

Caused by: java.util.NoSuchElementException: spark.sql.hive.hiveserver2.jdbc.url
    at org.apache.spark.sql.internal.SQLConf$$anonfun$getConfString$2.apply(SQLConf.scala:1571)
    at org.apache.spark.sql.internal.SQLConf$$anonfun$getConfString$2.apply(SQLConf.scala:1571)


Code:

Dependencies I am using in pom.xml:

    <dependency>
        <groupId>com.hortonworks</groupId>
        <artifactId>spark-llap_2-11</artifactId>
        <version>1.0.2-2.1</version>
    </dependency>

    <!-- https://mvnrepository.com/artifact/com.hortonworks.hive/hive-warehouse-connector -->
    <dependency>
        <groupId>com.hortonworks.hive</groupId>
        <artifactId>hive-warehouse-connector_2.11</artifactId>
        <version>1.0.0.3.1.2.1-1</version>
    </dependency>
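
These Hortonworks artifacts are generally not published to Maven Central, so the build also needs the Hortonworks public repository declared in pom.xml. A minimal sketch, assuming the standard Hortonworks release repository URL (verify it for your HDP version):

    <repositories>
        <!-- Assumed Hortonworks public release repository; confirm the URL for your environment -->
        <repository>
            <id>hortonworks-releases</id>
            <url>https://repo.hortonworks.com/content/repositories/releases/</url>
        </repository>
    </repositories>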
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import com.hortonworks.hwc.HiveWarehouseSession

val sparkConfig = new SparkConf()

// General Spark settings for a small local run
sparkConfig.set("spark.broadcast.compress", "false")
sparkConfig.set("spark.shuffle.compress", "false")
sparkConfig.set("spark.shuffle.spill.compress", "false")
sparkConfig.set("spark.io.compression.codec", "lzf")
sparkConfig.set("spark.sql.catalogImplementation", "hive")
sparkConfig.set("hive.exec.dynamic.partition.mode", "nonstrict")
sparkConfig.set("spark.default.parallelism", "1")
sparkConfig.set("spark.sql.shuffle.partitions", "1")
// HWC / LLAP related settings
sparkConfig.set("spark.sql.hive.llap", "true")
sparkConfig.set("spark.datasource.hive.warehouse.load.staging.dir", "/tmp")
sparkConfig.set("spark.hadoop.hive.llap.daemon.service.hosts", "@llap0")
sparkConfig.set("spark.hadoop.hive.zookeeper.quorum", "host1:2181;host2:2181;host3:2181")
sparkConfig.set("spark.hadoop.metastore.catalog.default", "hive")

val _spark:SparkSession = SparkSession.builder
  .master("local")
  .appName("Unit Test")
  .config(sparkConfig)
  .enableHiveSupport()
  .getOrCreate()

println("Spark Session Initialized")
val hive = HiveWarehouseSession.session(_spark).build()
hive.showDatabases().show()
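
The NoSuchElementException above means HWC looks up spark.sql.hive.hiveserver2.jdbc.url in the Spark conf and it was never set. A minimal sketch of the missing settings, to be added alongside the sparkConfig entries above; the JDBC URL and metastore URI here are placeholders, not values from this thread, so take the real ones from Ambari (HiveServer2 Interactive JDBC URL and hive.metastore.uris):

// Placeholder endpoints for illustration only; replace with your cluster's actual values
sparkConfig.set("spark.sql.hive.hiveserver2.jdbc.url",
  "jdbc:hive2://host1:2181,host2:2181,host3:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2-interactive")
sparkConfig.set("spark.datasource.hive.warehouse.metastoreUri", "thrift://host1:9083")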