
Cannot read data using Spark - Hive Warehouse Connector

Expert Contributor

Hi guys,

I am using spark-shell to read Hive tables with Spark, launched as follows:

spark-shell --master yarn --jars /usr/hdp/current/hive-warehouse-connector/hive-warehouse-connector_2.11-1.0.0.3.1.2.0-4.jar --conf spark.security.credentials.hiveserver2.enabled=false

I am able to execute commands such as create table, create database, show tables, and show databases, but I am not able to read data from tables.


My code is as below:

import com.hortonworks.hwc.HiveWarehouseSession
val hive = HiveWarehouseSession.session(spark).build()
hive.createDatabase("spark_llap01",false)
hive.setDatabase("spark_llap01")
hive.createTable("hwx_table").column("value", "string").create()
hive.executeUpdate("insert into hwx_table values('1')")

hive.executeQuery("select * from hwx_table").show


I get the following error whenever I try to fetch data:

java.lang.AbstractMethodError: Method com/hortonworks/spark/sql/hive/llap/HiveWarehouseDataSourceReader.createBatchDataReaderFactories()Ljava/util/List; is abstract


I used beeline to check whether the data had been written: the database and the table both exist, and when I queried the table the data was there.
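
For context, a quick check like the following in beeline is enough to confirm the row is there (the HiveServer2 host and port below are placeholders, not values from this post):

beeline -u "jdbc:hive2://<hiveserver2-host>:10000" -e "select * from spark_llap01.hwx_table;"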



2 REPLIES

Expert Contributor

I had to use another version of the connector: hive-warehouse-connector-assembly-1.0.0.3.0.1.0-187.jar.
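
In case it helps anyone hitting the same AbstractMethodError (which typically points to the connector jar being built against a different Spark DataSourceV2 API than the Spark version on the cluster), the launch command would then look roughly like this; the jar path assumes the assembly is placed in the same connector directory as before, so adjust it to wherever the jar actually lives on your install:

spark-shell --master yarn \
  --jars /usr/hdp/current/hive-warehouse-connector/hive-warehouse-connector-assembly-1.0.0.3.0.1.0-187.jar \
  --conf spark.security.credentials.hiveserver2.enabled=false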




Explorer

Where can I get that jar?