Support Questions
Find answers, ask questions, and share your expertise

I am new to Hive. How do I delete tables from the Hive default warehouse folder? I have created drivers, driver1, etc. and want to clean them up

New Contributor

I created the tables drivers, driver1, and so on while trying to create temp tables, and now there is a whole list of them that I want to clean up.

8 Replies

Super Guru
@sherri cheng

Do you mean you have created tables called "drivers, driver1, etc." and now you want to get rid of the tables and their associated data? Are the folders created under the "/user/hive/warehouse" directory? Have you used the following?

DROP TABLE [IF EXISTS] table_name [PURGE]; -- e.g. DROP TABLE IF EXISTS drivers PURGE;

Then run it again for other tables.
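A minimal sketch of that cleanup, assuming the leftover tables are named drivers, driver1, and driver2 (adjust the names to match the output of SHOW TABLES):

```sql
-- Hypothetical table names; list yours with SHOW TABLES first.
SHOW TABLES;

DROP TABLE IF EXISTS drivers PURGE;
DROP TABLE IF EXISTS driver1 PURGE;
DROP TABLE IF EXISTS driver2 PURGE;

-- PURGE skips the .Trash folder, so for managed tables the
-- underlying warehouse data is removed immediately.
```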

Hi @mqureshi

What might be the reason a Hive managed table is dropped but its HDFS files are not removed? Thanks in advance.

Super Guru

@Bala Vignesh N V

If your table is not a Hive managed table (one whose data lives under the Hive warehouse directory), in other words if you created an external table, then dropping the table does not delete the data. Data is deleted on DROP TABLE only for Hive managed tables.
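The difference can be sketched like this (table names and the LOCATION path are illustrative, not from the thread):

```sql
-- Managed table: Hive owns the data under the warehouse directory.
CREATE TABLE managed_demo (id INT);
DROP TABLE managed_demo;    -- removes metadata AND the data files

-- External table: Hive only tracks metadata; the files at the
-- specified LOCATION survive the drop.
CREATE EXTERNAL TABLE external_demo (id INT)
LOCATION '/data/external_demo';
DROP TABLE external_demo;   -- removes metadata only; files remain
```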

Thanks @mqureshi.

I completely understand the concept of external tables. But what I was looking for is: for what reasons would a managed Hive table be dropped while the data underneath the Hive warehouse directory still exists?

Super Guru

@Bala Vignesh N V

Then it is likely a permissions issue. Check permissions on the .Trash folder, and check any Ranger policies for the user who is running DROP TABLE.
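A few illustrative commands for that kind of check (paths depend on your cluster layout, and the placeholders in angle brackets are assumptions, not values from the thread):

```shell
# Did the dropped table's data get moved to .Trash instead of deleted?
hadoop fs -ls /user/<username>/.Trash/Current/user/hive/warehouse

# Check ownership and permissions on the leftover warehouse folder:
hadoop fs -ls -d /user/hive/warehouse/<table_name>

# Check permissions on the user's .Trash folder itself:
hadoop fs -ls -d /user/<username>/.Trash
```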

Thanks @mqureshi

@sherri cheng

Do you mean you want to delete the folders on which the Hive table is created?

If it is a managed table, then dropping the Hive table will delete the folders underneath the warehouse directory.

But if it is an external table, then you have to manually delete the folders/files underneath.

Expert Contributor

Are these tables external tables? In the case of external tables you would have to manually clean up the folders by removing the files and directories that are referenced by the table (using the hadoop fs -rm command).
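A sketch of that manual cleanup, assuming the external table's files live under /data/drivers (replace with the actual Location shown by DESCRIBE FORMATTED for your table):

```shell
# Find where the table's data actually lives (look for "Location:"):
hive -e "DESCRIBE FORMATTED drivers;"

# Then remove the files and folders at that location.
# -skipTrash deletes immediately instead of moving to .Trash.
hadoop fs -rm -r -skipTrash /data/drivers
```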