
Hive Insert overwrite directory producing junk characters.

Solved


Hi All,

I am seeing a strange issue with Hortonworks Hive: the HQL statement below produces a file full of junk characters. Have I missed something? The same query produces readable data on a Cloudera cluster. Can someone please advise?

INSERT OVERWRITE LOCAL DIRECTORY '<Local Linux Directory>' ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' STORED AS TEXTFILE SELECT col1,col2 FROM table1;


3 REPLIES

Re: Hive Insert overwrite directory producing junk characters. (Accepted Solution)

Hi @Vinay R

What is the format of table1 in Hive? Is it stored in any format other than TEXTFILE? If so, the data may have been written compressed if compression is enabled in your table properties.
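
You can check how the table is stored with a quick metadata query (a minimal sketch, using table1 from your statement):

-- Shows the table's InputFormat/OutputFormat and table parameters, including any compression-related settings
DESCRIBE FORMATTED table1;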

Set the following property and then try the export again: set hive.exec.compress.output=false;
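
For reference, a minimal sketch of the full export session with output compression disabled, reusing the directory placeholder and columns from your query:

-- Disable compressed query output for this session so the exported file is plain text
set hive.exec.compress.output=false;

-- Re-run the export; the file written under the local directory should now be readable
INSERT OVERWRITE LOCAL DIRECTORY '<Local Linux Directory>'
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
SELECT col1, col2 FROM table1;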



Re: Hive Insert overwrite directory producing junk characters.

@Bala Vignesh N V Thank you very much. The table I am querying was indeed compressed, and the above suggestion did help.


Re: Hive Insert overwrite directory producing junk characters.

@Vinay R Glad it helped you. If it solves your problem, please accept the answer.
