
How to import table using Sqoop which has Blob column into Hive as Binary column in ORC File Format

New Contributor

I used the following Sqoop command to import DB2 data with a column defined as BLOB into Hive in ORC format.


sqoop import \
  --connect "jdbc:db2://******/DB22" --username ***** --password '******' \
  --table temp \
  --map-column-hive row_id=String,binary_id=Binary \
  --map-column-java row_id=String \
  -m 1 \
  --hcatalog-database temp_database --hcatalog-table temp_table \
  --create-hcatalog-table \
  --hcatalog-storage-stanza "stored as orcfile"

1) The table is created in Hive (one column holds binary data; the rest are STRING). When I run

select count(*) from temp_table;

I get the count. Selecting without that binary column also returns results.

2) When I try to select something from the table that includes the binary column, like:

select * from temp_table limit 1;

I get the error below:

Error: java.lang.reflect.InvocationTargetException
        at ...
        at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(...)
        at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.<init>(...)
Caused by: java.lang.NegativeArraySizeException
        at ...
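One way to isolate the failing read is to drive it from the shell and wrap only the binary column in Hive's built-in hex() function (a sketch; the JDBC URL here is a masked placeholder, like my other connection details):

```shell
# Read just the binary column through hex() to see whether deserializing
# binary_id alone triggers the exception.
# The JDBC URL is a masked placeholder.
beeline -u "jdbc:hive2://******:10000/temp_database" \
  -e "SELECT row_id, hex(binary_id) FROM temp_table LIMIT 1;"
```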


Could someone help me resolve this issue?



The application is trying to allocate an array with a negative size (for example, -1 or -2).

Could you please share your Hive ORC-related configuration?
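Also, if the binary mapping turns out to be the cause, one workaround sometimes suggested for BLOB columns combined with --hcatalog and ORC is to map the BLOB to a Java String via --map-column-java, so Sqoop serializes the value as text before the ORC writer sees it. This is only a sketch based on your command (connection details masked as in your post; the target table name temp_table_str is hypothetical), not something verified here:

```shell
# Sketch: map the BLOB column to String so it is stored as text rather
# than raw binary. Not verified on a live cluster; temp_table_str is a
# hypothetical new table name, and connection details are masked.
sqoop import \
  --connect "jdbc:db2://******/DB22" --username ***** --password '******' \
  --table temp \
  --map-column-java row_id=String,binary_id=String \
  -m 1 \
  --hcatalog-database temp_database --hcatalog-table temp_table_str \
  --create-hcatalog-table \
  --hcatalog-storage-stanza "stored as orcfile"
```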



New Contributor
Thanks for your reply.

Please find my Hive ORC-related config below.