Support Questions


Unable to access Avro object in HBase from Hive

Rising Star

I have a number of simple Avro objects stored in HBase and am trying to access them from Hive. I've set up a Hive table by following the instructions that I found here.

Basically in Hive I do:

DROP TABLE IF EXISTS HBaseAvro;
CREATE EXTERNAL TABLE HBaseAvro
    ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe'
    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    WITH SERDEPROPERTIES (
        "hbase.columns.mapping" = ":key,event:pCol",
        "event.pCol.serialization.type" = "avro",
        "event.pCol.avro.schema.url" = "hdfs:///tmp/kafka/avro/avro.avsc")
    TBLPROPERTIES (
        "hbase.table.name" = "avro",
        "hbase.mapred.output.outputtable" = "avro",
        "hbase.struct.autogenerate" = "true");

If the Avro object contains the schema in the header, I have no problem and can access the data. However, if the Avro object does NOT contain the schema, then when I try to access the Avro object I get an IO Exception:

{"message":"H170 Unable to fetch results. java.io.IOException: org.apache.hadoop.hive.ql.metadata.HiveException: Error evaluating event_pcol...

If I do a DESCRIBE on the Hive table, I can see the table correctly, in that event_pcol is shown as a structure, with the correct fields.

I've tried moving the avsc file to check that the CREATE TABLE is actually reading it, and Hive correctly complains when it's missing. With the CREATE as above the table appears to be created correctly and I can access the "key" values, so the problem appears to be with the Avro object itself.

To me it looks like Hive is not using the schema definition passed in the schema.url parameter. I've tried including the schema as a schema.literal parameter and it still fails.
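For reference, an Avro object container file (the kind with the schema in the header) can be distinguished from a bare, schema-less datum by its leading bytes. A quick sketch, assuming the HBase cell values are raw Avro bytes (the example payloads below are made up):

```python
# An Avro *object container file* starts with the 4-byte magic
# "Obj" + 0x01 and carries its schema in the file metadata; a bare
# Avro datum is just the encoded fields with no header at all.

AVRO_MAGIC = b"Obj\x01"

def has_embedded_schema(blob: bytes) -> bool:
    """True if the bytes look like an Avro object container file."""
    return blob[:4] == AVRO_MAGIC

# A container file begins with the magic...
print(has_embedded_schema(b"Obj\x01...metadata..."))  # True
# ...while a schema-less datum is just the encoded field values.
print(has_embedded_schema(b"\x06foo\x54"))            # False
```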

Any ideas?

1 ACCEPTED SOLUTION

Master Guru

Hi @mark doutre, I checked your files (two days ago, but couldn't post sooner), and my conclusion is that Hive cannot handle Avro files without a schema. The AvroSerDe page lists which Avro versions are supported (1.5.3 to 1.7.5), and the Avro specification says: "Avro data is always serialized with its schema. Files that store Avro data should always also include the schema for that data in the same file." It has been so since version 1. So it's very clear that "standard" Avro files must include the schema, and Hive supports only such files.

With schema-less files you are on your own: you would have to read the "value" from HBase yourself, apply your schema to decode the data, and store the resulting records in Hive. Alternatively, you can include the schema in each object, which will work, but you will waste some space in HBase by storing the same schema in every record. Hope this helps.


6 REPLIES

Rising Star

@Predrag Minovic Avro file and associated schema attached.

avrobug.zip

Rising Star

Associated Hive code. The Avro files are stored in /user/hue/testdata/avro_data/avro.avro etc.

DROP TABLE IF EXISTS avro_test;
CREATE EXTERNAL TABLE avro_test
    COMMENT "A table backed by Avro data with the Avro schema stored in HDFS"
    ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
    STORED AS
    INPUTFORMAT  'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
    OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
    LOCATION 'hdfs:///user/hue/testdata/avro_data'
    TBLPROPERTIES (
        'avro.schema.url'='hdfs:///user/hue/testdata/avro.avsc'
    );


Rising Star

Hi @Predrag Minovic

Thanks for taking the time to look into this. I had sort of come to the same conclusion, but all the info I had seen online seemed to suggest that Hive could access a schema-less Avro object provided that the schema was supplied via the TBLPROPERTIES avro.schema.url parameter.
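If writing the objects with their schema turns out to be the only option, the conversion amounts to wrapping the existing datum bytes in an Avro object container file (magic, schema metadata, sync marker, data blocks). A rough sketch in Python, with an illustrative schema rather than the real one:

```python
# Wrap pre-encoded, schema-less Avro datum bytes into a valid Avro
# object container file: magic, file metadata map (schema + codec),
# a 16-byte sync marker, then one data block ending in the same marker.
import os

def zigzag_long(n: int) -> bytes:
    """Encode an Avro long: zig-zag, then base-128 varint."""
    z = (n << 1) ^ (n >> 63)
    out = bytearray()
    while True:
        byte = z & 0x7F
        z >>= 7
        if z:
            out.append(byte | 0x80)
        else:
            out.append(byte)
            return bytes(out)

def avro_bytes(b: bytes) -> bytes:
    """Encode Avro bytes/string: length (as long) then the raw bytes."""
    return zigzag_long(len(b)) + b

def container_file(schema_json: bytes, datums: list) -> bytes:
    """Build a null-codec container file around pre-encoded datums."""
    sync = os.urandom(16)                    # per-file sync marker
    meta = (zigzag_long(2)                   # metadata map, 2 entries
            + avro_bytes(b"avro.schema") + avro_bytes(schema_json)
            + avro_bytes(b"avro.codec") + avro_bytes(b"null")
            + zigzag_long(0))                # end of map
    body = b"".join(datums)
    block = zigzag_long(len(datums)) + zigzag_long(len(body)) + body + sync
    return b"Obj\x01" + meta + sync + block

# e.g. one string datum "foo" under a trivial schema
blob = container_file(b'{"type": "string"}', [b"\x06foo"])
```

Files produced this way land in HDFS as ordinary Avro container files, which is the layout the AvroSerDe table in the reply above expects.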

Master Guru

@mark doutre I've just found a new blog post covering your use case of storing schema-less Avro objects in HBase. It works by interacting with HBase directly, without Hive. The code appears to be simple. HTH

New Contributor

Any update on this?

I am running into the same exception. Do we need to write the Avro record with its schema?

@mark doutre @Predrag Minovic