
Getting an error when selecting binary data from ORC files?



I used the following Sqoop command to import DB2 data, in which one column is defined as BLOB, into Hive in ORC format.

sqoop import \
  --connect "jdbc:db2://******/DB22" \
  --username ***** \
  --password '******' \
  --table temp \
  --map-column-hive row_id=String,binary_id=Binary \
  --map-column-java row_id=String \
  -m 1 \
  --hcatalog-database temp_database \
  --hcatalog-table temp_table \
  --create-hcatalog-table \
  --hcatalog-storage-stanza "stored as orcfile"

1) The table is created in Hive: one column holds binary data and the rest of the columns are STRING.

Running

select count(*) from temp_table;

returns the count, and selecting every column except the binary one also returns results.

2) But when I include the binary column in the select, for example

select * from temp_table limit 1;

I get the error below.
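For context, one common reason a plain SELECT over a binary column fails to render is that BLOB payloads contain byte sequences that are not valid text. A minimal Python sketch of that failure mode (the byte values here are made up for illustration; this is not the Hive code path itself):

```python
# Hypothetical BLOB payload: arbitrary bytes, not valid UTF-8 text.
blob = bytes([0xFF, 0xFE, 0x00, 0x13])

try:
    # Roughly what happens when a text renderer treats raw bytes as a string.
    blob.decode("utf-8")
except UnicodeDecodeError as err:
    print("cannot render as text:", err)
```

This is why binary columns usually need an explicit encoding step before they can be displayed.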

Please help.



What is the output of the command below? Also, why do you map row_id with both --map-column-hive and --map-column-java? Use one or the other.

select row_id, base64(binary_id) from temp_table limit 1;
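The point of base64() in the query above is that it re-encodes the raw bytes as printable ASCII, so the row can be displayed no matter what the payload contains. The same idea in Python (the sample bytes are illustrative):

```python
import base64

# Arbitrary binary payload standing in for a BLOB value.
blob = bytes([0xFF, 0xFE, 0x00, 0x13])

# Base64 output is always plain ASCII, so it is safe to print.
encoded = base64.b64encode(blob).decode("ascii")
print(encoded)

# The encoding round-trips losslessly back to the original bytes.
assert base64.b64decode(encoded) == blob
```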


@Artem Ervits Thanks for your reply. I tried it, but it returned no results and no errors.

@Sankar T

Can you take a look at the class generated by your Sqoop job and verify the data type for binary_id?
