New Contributor
Posts: 2
Registered: ‎09-28-2017

hadoop/ hive size


Hive has one database that contains 100 tables. How do I find the size of all 100 tables with a single script? I know that (hdfs dfs -du -s -h /path/to/table/name) is used for fetching the size of one table. What if I need to fetch the sizes of all 100 tables in one single go (one single script)?

Posts: 505
Topics: 14
Kudos: 84
Solutions: 45
Registered: ‎09-02-2016

Re: hadoop/ hive size



You can follow one of these options, depending on your data:


Option 1: Check the size of the full database (if that DB contains no tables other than those 100)

hdfs dfs -du -s -h /path/to/DB


Option 2: Use wildcards (if all 100 tables share a unique naming pattern, e.g. table names starting with mytable: mytable1, mytable2, mytable3, etc.)

hdfs dfs -du -s -h /path/to/table/mytable*


Option 3: If your DB has extra tables and there is no common pattern to select only those 100 out of all the tables, then you may need to write a script and pass the table names as parameters.
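A minimal sketch of such a script, assuming the tables live under a warehouse directory like /apps/hive/warehouse/mydb.db (the path and the mydb.db name are placeholders; substitute your own). Table names are passed as arguments and the script prints one size line per table:

```shell
#!/bin/bash
# Print the HDFS size of each Hive table passed as an argument.
# DB_PATH is a hypothetical warehouse location -- adjust it to your environment.
DB_PATH="/apps/hive/warehouse/mydb.db"

table_sizes() {
  local tbl
  for tbl in "$@"; do
    # -du -s gives one summary line per table directory; -h makes it human-readable
    hdfs dfs -du -s -h "${DB_PATH}/${tbl}"
  done
}

# Example invocation: table_sizes mytable1 mytable2 mytable3
table_sizes "$@"
```

You could then run it as ./table_sizes.sh table1 table2 ... table100, or feed it a file of table names with xargs.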



New Contributor
Posts: 2
Registered: ‎09-28-2017

Re: hadoop/ hive size

I need to get the individual size of each table, but for all 100 tables with one single command. Currently I am running

hdfs dfs -du -s /apps/hive/warehouse/database name/tablename

once per table to fetch each size, which is monotonous.
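One thing worth noting: if every table is a subdirectory of the database directory, dropping the -s flag should get you one line per table in a single command, since -du without -s reports each immediate child of the directory rather than a rolled-up total. A sketch, assuming a hypothetical database directory name mydb.db:

```shell
# Without -s, -du lists the size of each item inside the directory,
# i.e. one line per table subdirectory; -h keeps the sizes human-readable.
# "mydb.db" is a placeholder database directory -- substitute your own.
hdfs dfs -du -h /apps/hive/warehouse/mydb.db
```

This covers all tables stored under that one database path; tables with custom LOCATIONs outside the warehouse directory would still need to be checked separately.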