Created on 05-05-2017 05:09 AM - edited 09-16-2022 04:33 AM
Hi All,
I am seeing a strange issue with Hortonworks Hive: the HQL statement below produces a file full of junk characters. Have I missed something? The same query produces readable data on a Cloudera cluster. Can someone please advise?
INSERT OVERWRITE LOCAL DIRECTORY '<Local Linux Directory>' ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' STORED AS TEXTFILE SELECT col1,col2 FROM table1;
Created 05-05-2017 06:19 AM
Hi @Vinay R
What is the storage format of table1 in Hive? Is it stored in any format other than TEXTFILE? If so, the exported data may be compressed if compression is enabled in the table properties.
Run the following before the export and try again: set hive.exec.compress.output=false;
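For reference, a minimal sketch of the full sequence, assuming the same table1 and columns from the question and a hypothetical local path /tmp/table1_export (adjust to your own directory):

-- Check how table1 is stored; look at the "InputFormat" and "Compressed" fields in the output
DESCRIBE FORMATTED table1;

-- Disable compressed output for this session so the exported files are plain text
SET hive.exec.compress.output=false;

-- Re-run the export; the files written to the local directory should now be readable CSV
INSERT OVERWRITE LOCAL DIRECTORY '/tmp/table1_export'
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
SELECT col1, col2 FROM table1;

Note that SET hive.exec.compress.output=false only affects the current session; it does not change the table's own storage format or properties.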
Created 05-05-2017 06:58 AM
@Bala Vignesh N V Thank you very much. The table I am querying was indeed compressed, and the suggestion above resolved the issue.
Created 05-05-2017 07:06 AM
@Vinay R Glad it helped. If it solves your problem, please accept the answer.