Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.

SQOOP issues with ORACLE datatypes - RAW, BFILE, LONG RAW

Expert Contributor

Example Oracle RAW datatype value: "175D78D86FAFE6D19C631AF3BFC246EB". Sqoop converts that data into a string, but the data has spaces in it when I check the HDFS file using `hive --orcfiledump -d <hdfsfile>`. The RAW datatype is expressed in hexadecimal on the Oracle side, and Sqoop adds spaces in the HDFS file, like "17 5D 78 D8 6F AF E6 D1 9C 63 1A F3 BF C2 46 EB". Has anyone faced this?
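One thing worth trying (a hedged sketch, not a confirmed fix): Sqoop's `--map-column-java` option can override the default mapping so the RAW column is imported as a Java `String` rather than a byte array, which is the usual cause of the space-separated hex rendering. The connection string, table, and column names below are placeholders, not values from this thread.

```shell
# Sketch: force the RAW column to be treated as a String during import.
# MYTABLE, RAW_COL, and the JDBC URL are hypothetical placeholders.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username scott -P \
  --table MYTABLE \
  --map-column-java RAW_COL=String \
  --target-dir /data/mytable
```

If the column still comes through space-separated, the spaces can be stripped downstream (e.g. with Hive's `regexp_replace`), but fixing the mapping at import time is cleaner.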

The RAW datatype is listed as supported in the Sqoop documentation: https://sqoop.apache.org/docs/1.4.6/SqoopUserGuide.html

I see that the BFILE and LONG RAW datatypes are listed as unsupported in the Sqoop documentation. Is there a workaround for these?
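A possible workaround sketch for LONG RAW (all object names below are hypothetical): Sqoop cannot read LONG RAW directly, but it does handle BLOB columns, so the data can be staged as a BLOB on the Oracle side first using the `TO_LOB` function, then imported from the staging table. For BFILE the situation is different, since the data lives in a file outside the database; one option is loading the file contents into a BLOB column with `DBMS_LOB.LOADBLOBFROMFILE` before importing, though that duplicates the data inside the database.

```shell
# Sketch, assuming a table MYTABLE with a LONG RAW column LONG_RAW_COL.
#
# Step 1 (run in SQL*Plus or similar): stage the LONG RAW data as a BLOB.
#   CREATE TABLE MYTABLE_STAGE AS
#     SELECT ID, TO_LOB(LONG_RAW_COL) AS BLOB_COL FROM MYTABLE;
#
# Step 2: import the staging table, which now contains only types Sqoop supports.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username scott -P \
  --table MYTABLE_STAGE \
  --target-dir /data/mytable_stage
```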

1 ACCEPTED SOLUTION

Super Guru
4 REPLIES

Super Guru

Expert Contributor

Yes, I tried it and it doesn't help.

Expert Contributor

This has been accepted as a bug by the support team and is being tracked in an internal JIRA.


Hi Balaji, what is the status of this JIRA? Do you have a reference to it that I can follow?