
Hive Insert overwrite directory producing junk characters.

Contributor

Hi All,

I am seeing a strange issue with Hortonworks Hive: the HQL statement below produces a file full of junk characters. Have I missed something? The same query produces readable data on a Cloudera cluster. Can someone please advise?

INSERT OVERWRITE LOCAL DIRECTORY '<Local Linux Directory>'
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
SELECT col1, col2 FROM table1;

1 ACCEPTED SOLUTION


Hi @Vinay R

What is the storage format of table1 in Hive? Is it stored in a format other than TEXTFILE? If so, the output data may be compressed because compression is enabled in the table properties.

Execute this before the export and try again: set hive.exec.compress.output=false;
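For reference, a sketch of how this check and workaround might look in a Hive session (the table name table1 and the directory placeholder are from the question; SHOW CREATE TABLE and the session-level SET are standard Hive commands, shown here as an illustration rather than the poster's exact session):

```sql
-- Inspect how table1 is stored: a non-TEXTFILE format (ORC, Parquet, ...)
-- or compression-related TBLPROPERTIES suggest the exported bytes will be
-- compressed and therefore look like junk in a plain-text viewer.
SHOW CREATE TABLE table1;

-- Disable compression of query output for this session only,
-- then rerun the export so the local files are plain text.
SET hive.exec.compress.output=false;

INSERT OVERWRITE LOCAL DIRECTORY '<Local Linux Directory>'
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
SELECT col1, col2 FROM table1;
```

Note that SET changes the property only for the current session, so other jobs that rely on compressed output are unaffected.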


3 REPLIES


Contributor

@Bala Vignesh N V Thank you very much. The table I was querying was indeed compressed, and the suggestion above fixed the issue.


@Vinay R Glad it helped. If it solved your problem, please accept the answer.