
hadoop/ hive size

Explorer

 

Hive has one database which has 100 tables. How do I find the size of those 100 tables using a single script? I know that (hdfs dfs -du -s -h /path/to/table/name) fetches the size of one table. What if I need to fetch the sizes of all 100 tables in one go (one single script)?

2 REPLIES

Re: hadoop/ hive size

Champion

@shrimenon

 

Depending on your data, you can follow one of these options:

 

Option 1: Check the size of the full database (if that DB has no extra tables):

hdfs dfs -du -s -h /path/to/DB

 

Option 2: Use wildcards (if all 100 tables share a unique naming pattern, e.g. table names start with mytable: mytable1, mytable2, mytable3, etc.):

hdfs dfs -du -s -h /path/to/table/mytable*

 

Option 3: If your DB has extra tables and there is no common pattern to match only those 100 tables, then you may need to write a script that takes the table name as a parameter, as sketched below.
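
For example, a minimal shell sketch, assuming the warehouse path /apps/hive/warehouse/mydb.db and a file tables.txt listing the 100 table names one per line (both names are placeholders to adapt to your environment):

#!/bin/bash
# Print the size of each listed table directory, one hdfs du call per table.
# /apps/hive/warehouse/mydb.db and tables.txt are placeholders.
while read -r table; do
  hdfs dfs -du -s -h "/apps/hive/warehouse/mydb.db/${table}"
done < tables.txt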

 

 


Re: hadoop/ hive size

Explorer

I need to get the individual size of each table, but for all 100 tables with one single command. Currently I'm using

hdfs dfs -du -s /apps/hive/warehouse/database name/tablename

to fetch the size of each table one at a time, which is monotonous.
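
For reference, one way to get every table's size in a single command is to run du on the database directory without -s, which lists each child table directory individually. A minimal sketch (the database path is a placeholder):

# Lists the size of every table directory under the database in one call.
# /apps/hive/warehouse/mydb.db is a placeholder for the actual database path.
hdfs dfs -du -h /apps/hive/warehouse/mydb.db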