
How to load ESCAPE delimited data into Hive

Expert Contributor

Hi,

I need to load a Netezza-exported data file into Hive.

The data fields are delimited by the ESCAPE character. I tried the statement below, but it didn't work.

create external table foo (bar string) row format delimited fields terminated by '\01B' stored as textfile;

How can I load ESCAPE delimited data into Hive?

Thanks,

1 ACCEPTED SOLUTION

Expert Contributor

Thanks @Sunile Manjee

I wrote a simple Pig script to convert the 'escape' character to '\t' and it worked.

raw = load '/tmp/mydata' using PigStorage('\x1B');
store raw into '/tmp/output' using PigStorage('\t');


4 REPLIES

Master Guru

The ASCII ESCAPE character (octal \033, hexadecimal \x1B, decimal 27, often written ^[) is used by many output devices to start a series of characters called a control sequence or escape sequence.

Besides that, how about replacing the escape character with something more familiar, like ',', and then loading into Hive? This can be done with Pig or a simple sed command.
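For example, on a local copy of the export (the file name here is hypothetical, and the \x1b escape requires GNU sed), something like:

sed 's/\x1b/,/g' mydata.txt > mydata_comma.txt

Note this only works cleanly if the data itself contains no commas; a tab ('\t') can be a safer replacement character.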

Expert Contributor

The data size is pretty big. It would be ideal to load into Hive directly and convert to ORC.

I will try using Pig to convert ESCAPE to something else.
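For reference, once the text data is readable through a Hive table, the conversion to ORC can be a plain CTAS; foo_text and foo_orc below are hypothetical table names:

create table foo_orc stored as orc as select * from foo_text;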

Master Guru
@yjiang

Try the octal representation of the ESC character, '\033'.
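Applied to the DDL from the question, that would be roughly (untested sketch; table and column names taken from the original post):

create external table foo (bar string)
row format delimited fields terminated by '\033'
stored as textfile;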
