@Pedro Rodgers
If the schema is the same across all 100 text files, it is better to create a Hive external table, since the files are already on HDFS.
Example: if all the files are under the "/user/test/dummy/data" directory, then run the command below to create the external Hive table and point it at that HDFS location.
CREATE EXTERNAL TABLE `user` (
userId BIGINT,
type INT,
level TINYINT
)
COMMENT 'User Information'
PARTITIONED BY (`date` STRING)
LOCATION '/user/test/dummy/data';
Note that the partition column appears only in the PARTITIONED BY clause; Hive does not allow it to be repeated in the regular column list. Also, user and date are reserved keywords in recent Hive versions, so they are escaped with backticks.
Then create a folder named date=2011-11-11 inside /user/test/dummy/data/ and put the data files for 2011-11-11 into it. Once that is done, you also need to register the partition in the Hive metastore.
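For example, a minimal sketch from the shell (the local file name data-2011-11-11.txt is just a placeholder for your actual files):
hdfs dfs -mkdir -p /user/test/dummy/data/date=2011-11-11
hdfs dfs -put data-2011-11-11.txt /user/test/dummy/data/date=2011-11-11/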
ALTER TABLE `user` ADD PARTITION (`date`='2011-11-11');
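If you end up with many such date=... folders, Hive can also discover them all in one go instead of one ALTER TABLE per partition, assuming the directories follow the date=YYYY-MM-DD layout shown above. A quick way to check that the partition is visible afterwards:
MSCK REPAIR TABLE `user`;
SHOW PARTITIONS `user`;
SELECT * FROM `user` WHERE `date` = '2011-11-11' LIMIT 10;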