My Hive table has more than 80 million records, and we need to make a backup of it. Is there a best practice for doing this? I tried building a data flow in NiFi, but it hangs. Now I'm trying to export over SSH (PuTTY) with `hive -e 'select * from your_Table' | sed 's/[\t]/,/g' > /home/yourfile.csv`, but nothing happens when I run it. I'm using the HDP 2.5 sandbox.
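In case it clarifies what I'm after, this is the kind of server-side export I was also considering as an alternative (just a sketch; the output directory and the comma delimiter are my own placeholders, and I haven't confirmed it behaves well for a table this size):

```sql
-- Sketch: have Hive write the delimited output itself instead of
-- piping 80M rows through the shell and sed.
-- '/tmp/your_table_backup' is a placeholder path on the sandbox's local disk.
INSERT OVERWRITE LOCAL DIRECTORY '/tmp/your_table_backup'
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
SELECT * FROM your_Table;
```

Would something like this scale better than redirecting the query output through PuTTY?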
Thanks!