I have 2 TB of data in my Hadoop cluster, stored in Hive, and I want to bring this data to my local server. I used Hive for this task through the beeline CLI, roughly as follows:
use db1; for i in T1 T2 T3 ...; do export table $i to '/tmp/$i'; done
(Note: you may spot some errors in the query above; it's not exactly the syntax I used, but it's close enough, and my version works, so please don't focus on the query itself.)
This query is really slow at completing the task, so what I'm actually looking for is another solution, such as Sqoop, `hadoop fs -get /user/hive/warehouse/database.db`, or even Hive itself, that can do this as fast as possible.
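For reference, a syntactically valid shell version of the loop described above might look like the sketch below. The JDBC URL, database name, table list, and target paths are all placeholder assumptions to substitute with your own; it relies only on the standard `beeline -u`/`-e` options and Hive's `EXPORT TABLE` statement.

```shell
#!/bin/sh
# Sketch: run one Hive EXPORT TABLE statement per table through beeline.
# The URL, table names, and /tmp/<table> targets are assumptions.
export_tables() {
    # $1 = command to execute (normally "beeline"; pass "echo" for a dry run)
    cmd=$1; shift
    for t in "$@"; do
        "$cmd" -u "jdbc:hive2://localhost:10000/db1" \
               -e "EXPORT TABLE $t TO '/tmp/$t';"
    done
}

# Dry run: print the beeline invocations instead of executing them.
export_tables echo T1 T2 T3
# Real run (requires a reachable HiveServer2):
# export_tables beeline T1 T2 T3
```

The dry-run indirection is only there to make the loop inspectable without a cluster; in practice you would call `export_tables beeline ...` directly.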
For loops are not valid Hive syntax.
The Sqoop "export" command or SparkSQL are alternatives to what you are doing, but every solution will be slow in proportion to the size of the database tables. A single CPU and network interface can only move data so fast.
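One way to use more of the available network capacity is to run several copies concurrently instead of one table at a time. Below is a minimal sketch assuming the tables were already exported to `/tmp/<table>` on HDFS and that `/data/local` is the hypothetical local destination; it launches one `hadoop fs -get` per table as a background job and waits for all of them.

```shell
#!/bin/sh
# Sketch: pull exported table directories from HDFS to local disk in
# parallel. Table names and both paths are assumptions for illustration.
copy_tables() {
    # $1 = copy command (normally "hadoop"; pass "echo" for a dry run)
    cmd=$1; shift
    for t in "$@"; do
        "$cmd" fs -get "/tmp/$t" "/data/local/$t" &   # one background copy per table
    done
    wait   # block until every copy has finished
}

# Dry run: print the commands instead of executing hadoop.
copy_tables echo T1 T2 T3
# Real run (requires a configured Hadoop client):
# copy_tables hadoop T1 T2 T3
```

Parallelism only helps until the network interface or the local disk saturates, which is the bottleneck the answer above points out; beyond that point adding jobs gains nothing.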