
Hive writing many small csv files to HDFS


I am exporting Hive table data to CSV files in HDFS using a query of the form:

FROM Table T1 INSERT OVERWRITE DIRECTORY '<HDFS Directory>' SELECT *;

Hive is writing many small CSV files (1-2 MB each) to the destination directory.

Is there a way to control the number or the size of the output CSV files?

Note:

1) These CSV files are not used to create tables from, so I cannot replace the query with INSERT INTO TABLE...

2) I have already tried setting the following properties, to no avail:

hive.merge.mapfiles=true
hive.merge.mapredfiles
hive.merge.smallfiles.avgsize
hive.merge.size.per.task
mapred.max.split.size
mapred.min.split.size

TIA

I have many tables in Hive of varying sizes. Some are very large and some are small. For large tables I am fine with many files being generated, as long as each file is larger than 16 MB. I don't want to explicitly set the number of mappers, because that would hurt query performance on the large tables.

1 ACCEPTED SOLUTION

@Siddarth Wardhan

If you are using Tez as the execution engine, you need to set the following properties:

set hive.merge.tezfiles=true;
set hive.merge.smallfiles.avgsize=128000000;
set hive.merge.size.per.task=128000000;
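
Putting this together with the original export, a session would look something like the sketch below. The directory path is a placeholder as in the question, and the ~128 MB thresholds (given in bytes) are illustrative values you can tune:

-- Merge small files produced at the end of the Tez job
set hive.merge.tezfiles=true;
-- If the average output file size is below this many bytes, run an extra merge step
set hive.merge.smallfiles.avgsize=128000000;
-- Target size in bytes for each merged output file
set hive.merge.size.per.task=128000000;

FROM Table T1 INSERT OVERWRITE DIRECTORY '<HDFS Directory>' SELECT *;

hive.merge.smallfiles.avgsize decides whether the merge step fires at all, while hive.merge.size.per.task controls how large the merged files should be, so the two are usually set together.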




That works, thank you.