Created 03-09-2017 11:42 AM
I want to load CSV data into an Oracle database table using Sqoop.
Created 03-09-2017 03:04 PM
Try this approach, which I found here:
http://ingest.tips/2015/02/06/use-sqoop-transfer-csv-data-local-filesystem-relational-database/
Using Sqoop 1, CSV data on the local filesystem can be exported to the target table with the following command:
sqoop export -fs local -jt local --connect jdbc:mysql://example.com/example --username example --password example --table test --export-dir file:///tmp/data
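Since the question is about Oracle rather than MySQL, the same local-mode export should work with an Oracle thin-driver JDBC URL, provided the Oracle JDBC driver jar (ojdbc) is on Sqoop's classpath. A minimal sketch, where the host, SID, credentials, table name, and path are all placeholders:
sqoop export -fs local -jt local --connect jdbc:oracle:thin:@oradb.example.com:1521:ORCL --username example --password example --table TEST --export-dir file:///tmp/data
Note that Oracle usually stores table names in uppercase, so passing the table name in uppercase avoids a common "table not found" error.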
Created 03-10-2017 05:43 AM
I don't think Sqoop exports directly from Hive tables; it is intended as a bridge between HDFS and an RDBMS. However, you should be able to do the following.
From within Hive, run this command:
insert overwrite local directory '/home/user/staging' row format delimited fields terminated by ',' select * from table;
This command saves the results of the select to a delimited file on your local filesystem; you can then export that CSV to Oracle.
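For example, you could push the staged file into HDFS and then run a Sqoop export against Oracle. A minimal sketch, assuming the connection details, table name, and paths are placeholders, and that the Hive output above used ',' as the field delimiter:
hdfs dfs -put /home/user/staging /user/test/staging
sqoop export --connect jdbc:oracle:thin:@oradb.example.com:1521:ORCL --username user --password passwd --table BAR --export-dir /user/test/staging --input-fields-terminated-by ','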
Created 03-10-2017 05:49 AM
Hi Sunile,
Yes, I have tried it with MySQL, but I want to know if it's possible to migrate the data to an Oracle DB using Sqoop.
Created 03-15-2017 06:34 AM
Yes, it's possible to move data between Hadoop datastores (HDFS, Hive, HBase) and Oracle. Here is a sample command for exporting data from HDFS to Oracle.
sqoop export --connect jdbc:oracle:thin:@oradb.example.com:1521:ORCL --table bar --username user --password passwd --export-dir /user/test/data
The above command assumes that the CSV file to be exported into Oracle is in the HDFS directory /user/test/data and that the table "bar" already exists in Oracle. It also assumes the CSV columns appear in the same order as the Oracle table's columns.
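If the file's delimiters differ from Sqoop's defaults (fields terminated by ',' and lines by '\n'), or if only a subset of columns should be loaded, you can say so explicitly. A sketch with hypothetical delimiters and column names:
sqoop export --connect jdbc:oracle:thin:@oradb.example.com:1521:ORCL --table bar --username user --password passwd --export-dir /user/test/data --input-fields-terminated-by '\t' --columns "ID,NAME,SALARY"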