
Sqoop import data from Hive to CSV


Hi,

I need to migrate data from an old Hive DB to a new Hive DB on different servers. Both the old and the new system define an external table in Hive and fill it through CSV files. Can I export the data from the old system to a CSV file using Sqoop? That way, I can define the external table on the new system and the data migration will be done. Is that right?

1 ACCEPTED SOLUTION

Super Guru
@Andrea L

I don't believe that Sqoop supports importing from Hive or exporting to Hive. It is intended as a bridge between an RDBMS and Hive. However, you should be able to do what you want.
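For context, Sqoop's typical job is pulling a table from an RDBMS into Hadoop or Hive. A minimal sketch (the connection string, credentials, and table name here are hypothetical):

sqoop import --connect jdbc:mysql://dbhost/sales --username etl -P --table customers --hive-import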

From within Hive, run the following command:

insert overwrite local directory '/home/carter/staging' row format delimited fields terminated by ',' select * from hugetable;

This command saves the results of the SELECT to one or more comma-delimited files under the given directory on your local system.

If you want to do it outside of Hive, say via the Unix command line, you could try this:

hive -e 'select * from your_Table' | sed 's/[\t]/,/g'  > /home/yourfile.csv

The first command runs a query in Hive and pipes the output to sed, which converts the tab delimiters to commas and saves the result to a CSV file. Push this file to HDFS and then you can import that CSV file into the other Hive DB via an external table (see the sketch after the next command).

hive -e 'set hive.cli.print.header=true; select * from your_Table' | sed 's/[\t]/,/g'  > /home/yourfile.csv

The second command is similar, but tells Hive to also print the column headers.
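To finish the migration, push the CSV to HDFS on the new cluster and define an external table over it. A minimal sketch, assuming a hypothetical staging path and a two-column schema (adjust the column names and types to match your table):

hdfs dfs -mkdir -p /user/hive/staging
hdfs dfs -put /home/yourfile.csv /user/hive/staging/

create external table your_table (id int, name string) row format delimited fields terminated by ',' location '/user/hive/staging' tblproperties ('skip.header.line.count'='1');

The tblproperties clause is only needed if you exported with the header row; drop it otherwise.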


2 REPLIES


Expert Contributor

@Andrea L

As Michael Young said, Sqoop doesn't support importing from or exporting to Hive.

It is also recommended to use Hive's EXPORT/IMPORT statements to move your data between the two Hive instances. Check this out:


https://cwiki.apache.org/confluence/display/Hive/LanguageManual+ImportExport
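From that page, the basic pattern is one statement on each cluster. A minimal sketch (the export path is hypothetical, and the exported directory must be copied to the new cluster, e.g. with distcp, before importing):

-- on the old cluster
export table your_table to '/tmp/exports/your_table';

-- on the new cluster
import table your_table from '/tmp/exports/your_table';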

However, the CSV method can run into separator problems (for example, fields that themselves contain commas), and if the data is large you would have to consolidate it into one CSV file, which is not reassuring.