
Hive table with UTF-16 data


New Contributor

One of my clients is trying to create an external Hive table in HDP from CSV files (about 30 files, 2.5 TB total).

But the files are formatted as "Little-endian, UTF-16 Unicode text, with CRLF, CR line terminators". Here are a couple of questions:

Is there an easy way to convert CSV/TXT files from Unicode (UTF-16 / UCS-2) to ASCII (UTF-8)?

Is there a way for Hive to recognize this format?

He tried to use iconv to convert from UTF-16 to ASCII, but it fails when the source file is larger than 15 GB.

iconv -c -f utf-16 -t us-ascii

Any suggestions?

1 ACCEPTED SOLUTION


Re: Hive table with UTF-16 data

New Contributor

Here are some solution options I received from Ryan Merriman, Benjamin Leonhardi, and Peter Coates.

Option 1

You can use split -l to break the bigger file into smaller ones, then run iconv on each piece.
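A minimal sketch of the split-then-convert idea (file names are placeholders, and GNU split and iconv are assumed). One caveat: with UTF-16LE input, split -l cuts right after the 0x0A byte of each newline and strands the trailing 0x00 byte in the next chunk, mis-aligning it; splitting on an even byte count keeps every 2-byte code unit intact, so that is what this sketch does:

```shell
# Build a small UTF-16LE sample standing in for the large export
printf 'id,name\n1,alpha\n2,beta\n' | iconv -f UTF-8 -t UTF-16LE > sample_utf16.csv

# Split on an even byte count so no 2-byte UTF-16 code unit is cut in half
# (for a 15 GB+ file you would use something much larger, e.g. -b 1G)
split -b 16 sample_utf16.csv chunk_

# Convert each chunk independently; -c drops any unconvertible bytes
for f in chunk_*; do
  iconv -c -f UTF-16LE -t UTF-8 "$f" > "$f.utf8"
done

# Reassemble the converted pieces in order
cat chunk_*.utf8 > sample_utf8.csv
```

Because each chunk is converted independently, memory use stays bounded no matter how large the original file is.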

Option 2

If iconv fails, it would be a good idea to write a little program using ICU:

http://userguide.icu-project.org/conversion/converters

Option 3

You can try to do it in Java. Here's one example:

https://docs.oracle.com/javase/tutorial/i18n/text/stream.html

You can try using File(Input|Output)Stream together with the String class. You can specify the character encoding when reading (converting byte[] to String):

String s = new String(bytes, charset);

And when writing it back out (String to byte[]):

byte[] out = s.getBytes(charset);

This approach should solve your size limit problem.
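Rather than loading the whole file into a String, the same idea can be done in a streaming fashion with InputStreamReader/OutputStreamWriter, so memory use stays constant regardless of file size. A rough sketch (class and file names are placeholders; UTF-16LE is assumed per the file description in the question):

```java
import java.io.*;
import java.nio.charset.StandardCharsets;

public class Utf16ToUtf8 {

    // Decode UTF-16LE while reading and encode UTF-8 while writing,
    // moving data through a fixed-size buffer.
    public static void convert(InputStream in, OutputStream out) throws IOException {
        try (Reader reader = new InputStreamReader(in, StandardCharsets.UTF_16LE);
             Writer writer = new OutputStreamWriter(out, StandardCharsets.UTF_8)) {
            char[] buf = new char[64 * 1024];
            int n;
            while ((n = reader.read(buf)) != -1) {
                writer.write(buf, 0, n);
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // args[0]/args[1] are placeholder paths for the source and target files
        try (InputStream in = new FileInputStream(args[0]);
             OutputStream out = new FileOutputStream(args[1])) {
            convert(in, out);
        }
    }
}
```

Since the reader and writer only ever hold one buffer's worth of data, this avoids the size limit entirely.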

4 REPLIES



Re: Hive table with UTF-16 data

New Contributor

I used NiFi's ConvertCharacterSet processor to change from UTF-16LE to UTF-8. It's a great and straightforward option if you're already using NiFi :)

Re: Hive table with UTF-16 data

Expert Contributor

Hi, where can I find the character set values that are accepted by the ConvertCharacterSet processor?

Also, what component can I use to load the CSV file and to dump the results into the converted CSV file?

Re: Hive table with UTF-16 data

Expert Contributor

So I found the appropriate components, but it doesn't convert the file properly. Any idea? The input file is binary.