Access HBase increment column via Hive/Impala
Labels: Apache HBase
Created on 05-24-2016 01:26 PM - edited 09-16-2022 03:21 AM
HBase has a nice feature called counter increment, where you can atomically increment a value and get back the result. I want to create a simple external table over this HBase table, but I don't know how to choose the correct data type for Hive/Impala.
The value of colfam:FL_Lock is this:
20160512_000006 column=colfam:FL_Lock, timestamp=1464120550634, value=\x00\x00\x00\x00\x00\x00\x00\x00
If I create the external table with STRING, the query returns nothing, with no error.
If I create the external table with BIGINT/DECIMAL/INT, the query returns NULL and Impala reports an error:
Error converting column colfam:FL_Lock: '' TO INT
Any ideas how to map this HBase column correctly?
Thanks
Tomas
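
For context, a counter cell like the one shown above is written by the HBase increment API, which stores the running total as an 8-byte big-endian long; that is why it does not map cleanly onto a Hive STRING or INT. Below is a minimal sketch using the HBase 1.x Java client (the row key, column, and table name are taken from this thread; the connection setup and class name are illustrative):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class CounterCellExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Table table = conn.getTable(TableName.valueOf("HBTABLE"))) {

            byte[] row = Bytes.toBytes("20160512_000006");
            byte[] family = Bytes.toBytes("colfam");
            byte[] qualifier = Bytes.toBytes("FL_Lock");

            // Atomic increment: HBase stores the counter as an 8-byte big-endian long.
            long afterIncrement = table.incrementColumnValue(row, family, qualifier, 1L);
            System.out.println("counter after increment: " + afterIncrement);

            // Reading the raw cell back shows why Hive/Impala see binary bytes, not text.
            Result result = table.get(new Get(row));
            byte[] raw = result.getValue(family, qualifier);
            System.out.println("decoded with Bytes.toLong: " + Bytes.toLong(raw));
        }
    }
}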
Created 06-10-2016 02:19 AM
You can use the HBase Bytes#toLong API, http://archive.cloudera.com/cdh5/cdh/5/hbase/apidocs/org/apache/hadoop/hbase/util/Bytes.html#toLong(..., to perform the transformation within a UDF.
X-Ref: https://github.com/cloudera/hbase/blob/cdh5.7.1-release/hbase-server/src/main/java/org/apache/hadoop...
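
For illustration, a minimal sketch of such a UDF, assuming the HBase column is exposed to Hive in binary form and decoded with Bytes.toLong; the class name HBaseCounterToLong is hypothetical, not something from this thread:

import java.util.Arrays;

import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.LongWritable;

// Hypothetical UDF: decodes an 8-byte HBase counter cell into a BIGINT.
public class HBaseCounterToLong extends UDF {
    public LongWritable evaluate(BytesWritable raw) {
        if (raw == null || raw.getLength() < Bytes.SIZEOF_LONG) {
            return null;
        }
        // The BytesWritable backing array can be longer than the value, so copy the exact length.
        byte[] value = Arrays.copyOf(raw.getBytes(), raw.getLength());
        // HBase counters are written as 8-byte big-endian longs.
        return new LongWritable(Bytes.toLong(value));
    }
}

Once packaged into a jar and registered with CREATE TEMPORARY FUNCTION, a function like this could be applied to the counter column to get it back as a BIGINT.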
Created 06-10-2016 05:44 AM
Actually this has already been resolved; we changed the CREATE TABLE statement and added #b (hash b, meaning binary) to the column mapping, which tells the HBase storage handler to decode the cell bytes as a binary-encoded value instead of a string.
create external table md_extract_file_status ( table_key string, fl_counter bigint )
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,colfam:FL_Counter#b')
TBLPROPERTIES ('hbase.table.name' = 'HBTABLE');
