
Number-of-columns limitation in Hive over HBase tables?

Contributor

Hi,

I have been trying to create a bunch of Hive over HBase tables. The DDL creates the table on both the Hive and HBase sides, but when I try to query the Hive table I get the errors below once the number of columns reaches the 200-ish mark. I verified that the column count in the Hive-over-HBase DDL matches the mapping, and there are no special characters or anything of that sort. The same tables work fine with a smaller set of columns (fewer than 200 or so).

I'm not sure whether we are missing something in the configuration. We are on HDP 2.3.2. I'd appreciate any suggestions for debugging this issue.
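For context, the DDLs follow the usual Hive-over-HBase shape; a minimal sketch (the column names here are made up, the real tables have 200+ columns):

CREATE EXTERNAL TABLE CLARITY_RT.PATIENT_TRIAL (
  key    string,
  pat_id string,
  pat_nm string
  -- ...plus ~200 more columns
)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,v:pat_id,v:pat_nm')
TBLPROPERTIES ('hbase.table.name' = 'PATIENT_TRIAL');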

hive> select * from CLARITY_RT.PATIENT_TRIAL;

FAILED: RuntimeException MetaException(message:org.apache.hadoop.hive.serde2.SerDeException org.apache.hadoop.hive.hbase.HBaseSerDe: columns has 248 elements while hbase.columns.mapping has 207 elements (counting the key if implicit))

hive> select * from CLARITY_RT.ORDER_MED_TRIAL;

FAILED: RuntimeException MetaException(message:org.apache.hadoop.hive.serde2.SerDeException org.apache.hadoop.hive.hbase.HBaseSerDe: columns has 205 elements while hbase.columns.mapping has 198 elements (counting the key if implicit))

hive> select * from CLARITY_RT.DM_CF;

FAILED: RuntimeException MetaException(message:org.apache.hadoop.hive.serde2.SerDeException org.apache.hadoop.hive.hbase.HBaseSerDe: columns has 544 elements while hbase.columns.mapping has 225 elements (counting the key if implicit))

1 ACCEPTED SOLUTION

Contributor

Hi,

This has fixed the issue.

https://hortonworks.my.salesforce.com/kA2E0000000LZQ5?srPos=0&srKp=ka2&lang=en_US

ROOT CAUSE: The 4000-character limit on the PARAM_VALUE column of the SERDE_PARAMS table in the Hive metastore is the root cause of this issue. The hbase.columns.mapping string is stored in that column, so for tables with a very high column count it gets truncated, which eventually causes desc <table name> or select * from <table name> to fail with the error above.
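You can confirm the truncation directly on the metastore DB; a quick check (assuming a MySQL metastore with the standard schema, where SerDe properties live in SERDE_PARAMS):

-- mappings pinned at exactly 4000 characters have been truncated
SELECT SERDE_ID, LENGTH(PARAM_VALUE) AS mapping_len
FROM SERDE_PARAMS
WHERE PARAM_KEY = 'hbase.columns.mapping'
ORDER BY mapping_len DESC;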

WORKAROUND: This issue can be worked around by doing the following in the Hive metastore database:

-- log into the Hive Metastore DB, then:
alter table SERDE_PARAMS MODIFY PARAM_VALUE VARCHAR(400000000);
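Note that MySQL caps VARCHAR at 65,535 bytes, so if the metastore is on MySQL the statement above errors out in strict SQL mode and is silently converted to a TEXT type otherwise; an explicit alternative is:

-- MEDIUMTEXT holds up to 16 MB, ample for very wide mappings
alter table SERDE_PARAMS MODIFY PARAM_VALUE MEDIUMTEXT;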

Thanks.

-Datta


4 REPLIES

Master Guru

The error says that you have declared m columns in your Hive table and n cf:column mappings in your hbase.columns.mapping string, and that m != n. Can you check which of these numbers is wrong? As I'm sure you know, you can declare Hive columns in a free-text block using many lines and spaces, but the hbase.columns.mapping string is very restrictive: it allows no "beautifying" spaces, only the key and cf:column parts separated by commas. The Hive HBase Integration page doesn't mention any limit on the length of the string, though it admits that the string is somewhat cumbersome and restrictive.

Alternatively, you can map all columns of an HBase column family "cf" at once using the ":key,cf:" mapping string. They will map into a single Hive map<...> element holding a (column, value) entry per HBase column; see an example here. You can then keep working with the map, or explode it using Hive's explode(map) UDF.
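A minimal sketch of that approach (the table and family names here are made up):

CREATE EXTERNAL TABLE hbase_wide (rowkey string, v map<string,string>)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,v:')
TBLPROPERTIES ('hbase.table.name' = 'WIDE_TABLE');

-- look up individual qualifiers in the map...
SELECT rowkey, v['some_col'] FROM hbase_wide;

-- ...or explode the map into (qualifier, value) rows
SELECT rowkey, q, val
FROM hbase_wide LATERAL VIEW explode(v) t AS q, val;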

Contributor

Hi @Predrag Minovic, thanks for your response. I have the same number of columns on both the Hive and HBase sides in my DDL, and I don't see any whitespace issues. I am also attaching the two DDLs.

(1) working_ver works with a column count up to 207. After executing this script, both the Hive and HBase tables look fine.

hive> select * from PATIENT_NRICHED;
OK
Time taken: 0.447 seconds

(2) In not_working_ver I added one new column: "xyz string" on the Hive side and "v:xyz" on the HBase side. Executing the script still creates the table, but running "select * from CLARITY_RT.PATIENT_NRICHED" produces the following error:

hive> select * from PATIENT_NRICHED;
FAILED: RuntimeException MetaException(message:org.apache.hadoop.hive.serde2.SerDeException org.apache.hadoop.hive.hbase.HBaseSerDe: columns has 208 elements while hbase.columns.mapping has 207 elements (counting the key if implicit))

Thanks

working-ver.txt

not-working-ver.txt

Datta


Contributor

Many thanks for sharing this! It worked for me as well, but I am not sure whether this is the proper fix or only a workaround. I need to put a fix into the production environment for the same issue.