Member since 07-30-2018 · 6 Posts · 1 Kudos Received · 0 Solutions
11-21-2018 11:58 PM · 1 Kudo
import org.apache.spark.sql.DataFrame
import com.hortonworks.hwc.HiveWarehouseSession._   // brings HIVE_WAREHOUSE_CONNECTOR into scope

def save_table_hwc(df: DataFrame, database: String, tableName: String): Unit = {
  hive.setDatabase(database)
  // Drop any existing table with the same name (ifExists = true, purge = false), then rebuild it
  hive.dropTable(tableName, true, false)
  var table_builder = hive.createTable(tableName)
  // Add one column per DataFrame field, stripping characters Hive does not accept in column names
  for (field <- df.schema.fields) {
    val name = field.name.replaceAll("[^\\p{L}\\p{Nd}]+", "")
    table_builder = table_builder.column(name, field.dataType.sql)
  }
  table_builder.create()
  // Write the DataFrame into the newly created table through the connector
  df.write.format(HIVE_WAREHOUSE_CONNECTOR).option("table", tableName).save()
}

// Usage: save an existing DataFrame df1 as the Hive table default.table_test1
save_table_hwc(df1, "default", "table_test1")
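A note on context: the snippet assumes that `hive` and `df1` already exist in the spark-shell session. A minimal setup sketch under that assumption (the sample DataFrame and its column names are illustrative, not from the original post):

import com.hortonworks.hwc.HiveWarehouseSession
import spark.implicits._

// Build the HWC session from the active SparkSession; the HWC jar and the
// spark.sql.hive.hiveserver2.jdbc.url setting must already be supplied to spark-shell.
val hive = HiveWarehouseSession.session(spark).build()

// Illustrative DataFrame; the "value txt" column name shows the character cleanup done in save_table_hwc.
val df1 = Seq((1, "a"), (2, "b")).toDF("id", "value txt")

With those two values defined, the save_table_hwc call above creates default.table_test1 with columns id and valuetxt and writes the rows through the connector.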
04-09-2019 02:44 PM
Hi guys, I followed the above steps and was able to execute commands like show databases and show tables successfully. I also created a database from spark-shell, created a table, and inserted some data into it. However, I am not able to query the data, either from the newly created table or from the tables that already exist in Hive, and I get this error:

java.lang.AbstractMethodError: Method com/hortonworks/spark/sql/hive/llap/HiveWarehouseDataSourceReader.createBatchDataReaderFactories()Ljava/util/List; is abstract
at com.hortonworks.spark.sql.hive.llap.HiveWarehouseDataSourceReader.createBatchDataReaderFactories(HiveWarehouseDataSourceReader.java)

The commands are as below:

import com.hortonworks.hwc.HiveWarehouseSession
val hive = HiveWarehouseSession.session(spark).build()
hive.createTable("hwx_table").column("value", "string").create()
hive.executeUpdate("insert into hwx_table values('1')")
hive.executeQuery("select * from hwx_table").show

Then the error appears. I am using the below command to start spark-shell:

spark-shell --master yarn --jars /usr/hdp/current/hive-warehouse-connector/hive-warehouse-connector_2.11-1.0.0.3.1.2.0-4.jar --conf spark.security.credentials.hiveserver2.enabled=false
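For readers following the sequence above: the call that fails is executeQuery, which reads through the connector's batch DataSource reader, while HWC also exposes execute, which sends the statement over the HiveServer2 JDBC connection instead. A minimal sketch for comparison, assuming the same hive session as in the post (the remark about the result-size limit reflects the connector's default JDBC cap and is stated as an assumption):

// Same session as in the post
import com.hortonworks.hwc.HiveWarehouseSession
val hive = HiveWarehouseSession.session(spark).build()

// executeQuery(...) goes through the batch reader path (the call that raises the error above)
hive.executeQuery("select * from hwx_table").show()

// execute(...) runs the statement over the HiveServer2 JDBC connection and returns a
// size-limited DataFrame, intended for small, catalog-style reads.
hive.execute("select * from hwx_table").show()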