Member since: 02-09-2017 · Posts: 2 · Kudos Received: 0 · Solutions: 0
02-12-2017
04:22 PM
Thanks for your reply. Please find the Hive ORC-related config below:
hive.exec.orc.skip.corrupt.data=false
hive.exec.orc.default.row.index.stride=10000
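For completeness, these values can also be confirmed in a live session. A minimal sketch using the Hive CLI (exact output format depends on the Hive version):

# Print the current values of the two ORC reader settings for this session
hive -e "SET hive.exec.orc.skip.corrupt.data; SET hive.exec.orc.default.row.index.stride;"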
02-09-2017
11:59 AM
I used the following Sqoop command to import DB2 data (a column defined as BLOB) into Hive in ORC format:

sqoop import \
  --connect "jdbc:db2://******/DB22" --username ***** --password '******' \
  --table temp \
  --map-column-hive row_id=String,binary_id=Binary \
  --map-column-java row_id=String -m 1 \
  --hcatalog-database temp_database --hcatalog-table temp_table \
  --create-hcatalog-table \
  --hcatalog-storage-stanza "stored as orcfile"

1) The table is created in Hive (one column holds binary data; the rest of the columns are STRING). When I run

select count(*) from temp_table;

I get the count. When I select without that one binary column, I can also pull the result.

2) But when I try to select something that includes the binary column, like

select * from temp_table limit 1;

I get the error below:

Error: java.io.IOException: java.lang.reflect.InvocationTargetException
    at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderCreationException(HiveIOExceptionHandlerChain.java:97)
    at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderCreationException(HiveIOExceptionHandlerUtil.java:57)
    at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(HadoopShimsSecure.java:266)
    at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.<init>(HadoopShimsSecure.java:213)
Caused by: java.lang.NegativeArraySizeException
    at org.apache.hadoop.hive.ql.io.orc.RecordReaderUtils.readDiskRanges(RecordReaderUtils.java:271)
    at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl.readPartialDataStreams(RecordReaderImpl.java:991)
    at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl.readStripe(RecordReaderImpl.java:819)
    at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl.advanceStripe(RecordReaderImpl.java:1013)

Could someone help me resolve this issue?
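If it helps with diagnosis, the metadata of the ORC files written by the import can be dumped as sketched below. The warehouse path and file name here are assumptions about where the HCatalog import landed the data:

# List the ORC files produced by the import (path and file names are assumed)
hdfs dfs -ls /apps/hive/warehouse/temp_database.db/temp_table

# Dump stripe, stream and column statistics for one file, to check whether the
# stream lengths recorded for the binary column look reasonable
hive --orcfiledump /apps/hive/warehouse/temp_database.db/temp_table/part-m-00000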
Labels:
- Hadoop Concepts
- HDFS
- Hive
- Sqoop