Unable to access Avro object in HBase from Hive
Labels: Apache HBase, Apache Hive
Created ‎05-06-2016 11:10 AM
I have a number of simple Avro objects stored in HBase and am trying to access them from Hive. I've set up a Hive table by following the instructions that I found here.
DROP TABLE IF EXISTS HBaseAvro;
CREATE EXTERNAL TABLE HBaseAvro
ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe'
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES (
  "hbase.columns.mapping" = ":key,event:pCol",
  "event.pCol.serialization.type" = "avro",
  "event.pCol.avro.schema.url" = "hdfs:///tmp/kafka/avro/avro.avsc")
TBLPROPERTIES (
  "hbase.table.name" = "avro",
  "hbase.mapred.output.outputtable" = "avro",
  "hbase.struct.autogenerate" = "true");
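For reference, the mapping above expects a record schema at hdfs:///tmp/kafka/avro/avro.avsc. A minimal schema of roughly the right shape might look like the JSON below (the field names are invented for illustration, not taken from the original post); the snippet simply parses it to show the record structure that hbase.struct.autogenerate turns into a Hive struct:

```python
import json

# Hypothetical minimal Avro schema for the event:pCol column.
# Field names are illustrative only.
AVRO_SCHEMA = """
{
  "type": "record",
  "name": "Event",
  "namespace": "com.example",
  "fields": [
    {"name": "id",   "type": "long"},
    {"name": "name", "type": "string"}
  ]
}
"""

schema = json.loads(AVRO_SCHEMA)
print(schema["type"], [f["name"] for f in schema["fields"]])
# -> record ['id', 'name']
```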
If the Avro object contains the schema in the header, I have no problem and can access the data. However, if the Avro object does NOT contain the schema, then when I try to access it I get an IOException:
{"message":"H170 Unable to fetch results. java.io.IOException: org.apache.hadoop.hive.ql.metadata.HiveException: Error evaluating event_pcol...
If I do a DESCRIBE on the Hive table, I can see the table correctly, in that event_pcol is shown as a structure, with the correct fields.
I've tried moving the avsc file to check that the CREATE TABLE is actually reading it, and Hive correctly complains when the file is missing. With the CREATE as above, the table appears to be created correctly and I can access the "key" values, so the problem appears to be with the Avro object itself.
To me it looks like Hive is not using the schema definition passed in the schema.url parameter. I've tried including the schema as a schema.literal parameter and it still fails.
Any ideas?
Created ‎05-13-2016 12:39 PM
Hi @mark doutre, I checked your files (two days ago, but couldn't post sooner), and my conclusion is that Hive cannot handle Avro files without an embedded schema. The AvroSerDe page lists the supported Avro versions (1.5.3 to 1.7.5), and the Avro specification says: "Avro data is always serialized with its schema. Files that store Avro data should always also include the schema for that data in the same file." That has been so since version 1. So it's very clear that "standard" Avro files must include the schema, and Hive supports only such files.

With schema-less files you are on your own: you would have to read the "value" from HBase yourself, apply your schema to decode the data, and store the resulting records in Hive. You can also include the schema in each record, which will work, but you will waste some space in HBase by storing the same schema in every record. Hope this helps.
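The "read the value and apply your schema yourself" approach can be sketched as follows. This is a minimal hand-rolled illustration of Avro's binary encoding for an assumed record of one long and one string (the field names and layout are hypothetical; a real reader would use the Avro library against the actual .avsc file). The point it demonstrates is that a schema-less datum is just concatenated field encodings, so the reader must already know the field order and types:

```python
import io

def zigzag_encode(n: int) -> bytes:
    """Avro longs are zigzag-encoded, then written as a varint."""
    z = (n << 1) ^ (n >> 63)
    out = bytearray()
    while True:
        b = z & 0x7F
        z >>= 7
        if z:
            out.append(b | 0x80)
        else:
            out.append(b)
            return bytes(out)

def zigzag_decode(buf: io.BytesIO) -> int:
    """Read one zigzag-varint-encoded Avro long from the buffer."""
    shift, acc = 0, 0
    while True:
        b = buf.read(1)[0]
        acc |= (b & 0x7F) << shift
        if not b & 0x80:
            break
        shift += 7
    return (acc >> 1) ^ -(acc & 1)

# Hypothetical record layout: {"id": long, "name": string}.
def encode_record(rec: dict) -> bytes:
    name = rec["name"].encode("utf-8")
    return zigzag_encode(rec["id"]) + zigzag_encode(len(name)) + name

def decode_record(data: bytes) -> dict:
    buf = io.BytesIO(data)
    rec_id = zigzag_decode(buf)        # only meaningful because we KNOW
    name_len = zigzag_decode(buf)      # field 1 is a long, field 2 a string
    return {"id": rec_id, "name": buf.read(name_len).decode("utf-8")}

raw = encode_record({"id": 42, "name": "click"})
print(decode_record(raw))  # -> {'id': 42, 'name': 'click'}
```

In the HBase case, `raw` would be the cell value fetched for `event:pCol`; the decoded dicts could then be written into a normal Hive table.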
Created ‎05-10-2016 03:12 PM
@Predrag Minovic: Avro file and associated schema.
Created ‎05-10-2016 03:16 PM
Associated Hive code. Avro files are stored in /user/hue/testdata/avro_data/avro.avro etc
DROP TABLE IF EXISTS avro_test;
CREATE EXTERNAL TABLE avro_test
COMMENT "A table backed by Avro data with the Avro schema stored in HDFS"
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
LOCATION 'hdfs:///user/hue/testdata/avro_data'
TBLPROPERTIES ('avro.schema.url'='hdfs:///user/hue/testdata/avro.avsc');
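One quick way to tell whether a given file is a proper Avro container (and therefore readable by AvroContainerInputFormat) is to look at its header: container files start with the magic bytes Obj\x01 and embed the writer schema under the avro.schema metadata key, while a raw schema-less datum has no header at all. A rough sketch, assuming the simple positive-count map encoding that Avro writers normally emit:

```python
import io
import json

MAGIC = b"Obj\x01"  # Avro object container file magic

def read_long(buf):
    """Read one zigzag-varint-encoded Avro long."""
    shift, acc = 0, 0
    while True:
        b = buf.read(1)[0]
        acc |= (b & 0x7F) << shift
        if not b & 0x80:
            break
        shift += 7
    return (acc >> 1) ^ -(acc & 1)

def embedded_schema(data: bytes):
    """Return the embedded writer schema, or None if there is no container header."""
    if not data.startswith(MAGIC):
        return None
    buf = io.BytesIO(data[len(MAGIC):])
    meta = {}
    while True:
        count = read_long(buf)  # sketch: assumes positive map block counts
        if count == 0:
            break
        for _ in range(count):
            key = buf.read(read_long(buf)).decode("utf-8")
            meta[key] = buf.read(read_long(buf))
    raw = meta.get("avro.schema")
    return json.loads(raw) if raw else None

# A tiny hand-built header embedding the schema "long":
sample = MAGIC + bytes([2, 22]) + b"avro.schema" + bytes([12]) + b'"long"' + bytes([0])
print(embedded_schema(sample))   # -> long
print(embedded_schema(b"\x2a"))  # raw datum bytes, no header -> None
```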
Created ‎05-13-2016 02:54 PM
Thanks for taking the time to look into this. I had sort of come to the same conclusion, but all the information I had seen online seemed to suggest that Hive could access a schema-less Avro object provided that the schema was supplied via the TBLPROPERTIES avro.schema.url parameter.
Created ‎05-19-2016 04:10 AM
@mark doutre I've just found a new blog post talking about your use-case of storing Avro schema-less objects in HBase. It's implemented by direct interaction with HBase, without Hive. The code appears to be simple. HTH
Created ‎08-22-2018 08:47 PM
Any update on this?
I am running into the same exception. Do we need to write the Avro record with its schema?
