Support Questions

Find answers, ask questions, and share your expertise

Phoenix Integer field issue

I am attempting to create and use a Phoenix table on top of an HBase table that was originally created from Hive using the HBaseStorageHandler. However, I get an error when selecting data from the Phoenix table.

Hive Table DDL

create table MYTBL(
col1 string, 
col2 int, 
col3 int, 
col4 string ) 
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' 
("hbase.columns.mapping" = ":key, 

Phoenix Table DDL

"attr"."col2" INTEGER, 
"attr"."col3" INTEGER, 
"attr"."col4" VARCHAR )

Once both tables are created, I insert data into the Hive table using -

insert into table MYTBL values ("hive", 1, 2, "m");

At this point, the data is available in the Hive table and the underlying HBase table. The HBase table contents are shown below.


I can also insert data into Phoenix table and it shows up in underlying HBase table.

upsert into "MYTBL" values ('phoenix', 3, 4, 'm+c');


One thing to note here is how differently the integer values are stored for the data inserted through Phoenix.
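The difference comes down to byte encoding. A minimal sketch of the two schemes (the `phoenixEncode`/`hiveEncode` helper names are mine, not real Phoenix or Hive APIs): Phoenix serializes an INTEGER as four big-endian bytes with the sign bit flipped so byte-wise comparison matches numeric order, while Hive's HBaseStorageHandler by default writes the number as its decimal string.

```java
import java.nio.charset.StandardCharsets;

public class IntEncodingDemo {
    // Phoenix-style INTEGER encoding: 4-byte big-endian with the sign bit
    // flipped (a sketch of the scheme, not Phoenix's actual PInteger class).
    static byte[] phoenixEncode(int v) {
        int flipped = v ^ 0x80000000;
        return new byte[] {
            (byte) (flipped >>> 24), (byte) (flipped >>> 16),
            (byte) (flipped >>> 8),  (byte) flipped
        };
    }

    // Hive HBaseStorageHandler default: the int is serialized as its
    // decimal string, e.g. 1 -> "1" (a single byte in the cell).
    static byte[] hiveEncode(int v) {
        return Integer.toString(v).getBytes(StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        System.out.println(phoenixEncode(1).length); // always 4 bytes
        System.out.println(hiveEncode(1).length);    // 1 byte here
    }
}
```

When Phoenix reads a cell written by Hive, it expects at least 4 bytes for an INTEGER but finds the short string-encoded value, which is exactly the shape of the "Expected length of at least 4 bytes" error.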

When I run a select query from Phoenix, it throws an error while parsing the integer fields that were inserted via Hive into HBase.


Text version of the error below -


0: jdbc:phoenix:> select * from MYTBL;

Error: ERROR 201 (22000): Illegal data. Expected length of at least 4 bytes, but had 2 (state=22000,code=201)

java.sql.SQLException: ERROR 201 (22000): Illegal data. Expected length of at least 4 bytes, but had 2

at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(

at org.apache.phoenix.exception.SQLExceptionInfo.buildException(


at org.apache.phoenix.schema.KeyValueSchema.iterator(

at org.apache.phoenix.schema.KeyValueSchema.iterator(

at org.apache.phoenix.schema.KeyValueSchema.iterator(

at org.apache.phoenix.expression.ProjectedColumnExpression.evaluate(

at org.apache.phoenix.compile.ExpressionProjector.getValue(

at org.apache.phoenix.jdbc.PhoenixResultSet.getString(

at sqlline.Rows$Row.<init>(

at sqlline.BufferedRows.<init>(

at sqlline.SqlLine.print(

at sqlline.Commands.execute(

at sqlline.Commands.sql(

at sqlline.SqlLine.dispatch(

at sqlline.SqlLine.begin(

at sqlline.SqlLine.start(

at sqlline.SqlLine.main(


This is yet another form of a common question asked on this forum: tables that you intend to read with Phoenix should be created and written through Phoenix.

Phoenix uses a very specific type-serialization format. The error you are seeing tells you that the data in the cell does not match the serialization Phoenix expects: the table you created through the Hive HBaseStorageHandler stores values in a representation that is not natively compatible with Phoenix. This is not a Phoenix bug.

To access the same data from both Hive and Phoenix, use the PhoenixStorageHandler, which is available in HDP 2.5 and later.
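For reference, a Hive DDL along these lines maps the table through the PhoenixStorageHandler, so both engines read and write Phoenix's serialization. This is a sketch: the ZooKeeper quorum, port, and znode values are placeholders for your cluster, and the table/column names follow the question above.

```sql
-- Hive table backed by a Phoenix table via the PhoenixStorageHandler
CREATE EXTERNAL TABLE MYTBL (
  col1 STRING,
  col2 INT,
  col3 INT,
  col4 STRING)
STORED BY 'org.apache.phoenix.hive.PhoenixStorageHandler'
TBLPROPERTIES (
  "phoenix.table.name" = "MYTBL",
  "phoenix.zookeeper.quorum" = "localhost",        -- placeholder
  "phoenix.zookeeper.client.port" = "2181",        -- placeholder
  "phoenix.zookeeper.znode.parent" = "/hbase",     -- placeholder
  "phoenix.rowkeys" = "col1"
);
```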


Hi Josh,

In the Phoenix data type description ( link ), it's mentioned that the Phoenix unsigned data types map to the HBase Bytes.toBytes methods. Is there a way to use these unsigned data types to map existing HBase data to Phoenix tables and read the data correctly from Phoenix? I mapped numbers inserted through the HBase shell to the UNSIGNED_INT data type in Phoenix, but I still got the same error that bsaini reported above. Could you please clarify whether UNSIGNED_INT can be used in this scenario?