Is it possible to import CSV file data into an Oracle database using Sqoop?
- Labels: Apache Sqoop
Created ‎03-09-2017 11:42 AM
I want to load CSV data into an Oracle database table using Sqoop.
Created ‎03-09-2017 03:04 PM
Try this approach, which I found here:
http://ingest.tips/2015/02/06/use-sqoop-transfer-csv-data-local-filesystem-relational-database/
Using Sqoop1, CSV data on the local filesystem can be exported to the target table with the following command:
sqoop export -fs local -jt local --connect jdbc:mysql://example.com/example --username example --password example --table test --export-dir file:///tmp/data
NOTES
- -fs local and -jt local reference the local filesystem and make Sqoop run a local MapReduce job
- The URI prefix file:/// is used for accessing the local filesystem
- /tmp/data/sample.csv should have the same number of columns as the table, with values mapping to columns by position
- You may need to provide -libjars if you're getting ClassNotFound exceptions
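Since the question targets Oracle rather than MySQL, the same local-filesystem export can be pointed at an Oracle JDBC URL. A sketch, assuming a hypothetical host oradb.example.com, SID ORCL, and a table TEST that already exists in Oracle (the Oracle ojdbc driver jar must be on Sqoop's classpath, e.g. via -libjars or the Sqoop lib directory):

```shell
# Export a local CSV directory into an Oracle table.
# Host, SID, credentials, and table name below are placeholders.
sqoop export -fs local -jt local \
  --connect jdbc:oracle:thin:@oradb.example.com:1521:ORCL \
  --username example --password example \
  --table TEST \
  --export-dir file:///tmp/data \
  --fields-terminated-by ','
```

Note that Oracle usually stores unquoted identifiers in upper case, so the table name is given in upper case here.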
Created ‎03-10-2017 05:43 AM
I don't think that Sqoop supports exporting directly from a Hive table. It is intended as a bridge between HDFS and an RDBMS. However, you should be able to do the following:
From within Hive, run this command:
insert overwrite local directory '/home/user/staging' row format delimited fields terminated by ',' select * from table;
This command saves the results of the select on the table to a file on your local filesystem; you can then export that CSV to Oracle with Sqoop.
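The two steps above can be sketched end to end. This is an illustration only, assuming a hypothetical Hive table src_table, an Oracle table STAGING with matching columns, and placeholder connection details:

```shell
# Step 1: dump the Hive table to a local directory as comma-delimited text.
hive -e "insert overwrite local directory '/home/user/staging' \
         row format delimited fields terminated by ',' \
         select * from src_table;"

# Step 2: export the dumped files into Oracle with a local Sqoop job.
sqoop export -fs local -jt local \
  --connect jdbc:oracle:thin:@oradb.example.com:1521:ORCL \
  --username example --password example \
  --table STAGING \
  --export-dir file:///home/user/staging \
  --fields-terminated-by ','
```

The field terminator passed to Sqoop must match the one used in the Hive dump, otherwise the export will fail to parse the rows.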
Created ‎03-10-2017 05:49 AM
Hi Sunile,
Yes, I have tried this with MySQL, but I want to know if it's possible to migrate the data to an Oracle DB using Sqoop.
Created ‎03-15-2017 06:34 AM
Yes, it's possible to move data between Hadoop datastores (HDFS, Hive, HBase) and Oracle. Here is a sample command for exporting data from HDFS to Oracle.
sqoop export --connect jdbc:oracle:thin:@oradb.example.com:1521:ORCL --table bar --username user --password passwd --export-dir /user/test/data
The above command assumes that the CSV data file to be exported to Oracle is in the HDFS folder /user/test/data and that the table "bar" already exists in Oracle. It also assumes the CSV column sequence matches that of the Oracle table.
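If the CSV column order does not match the table definition, --columns can spell out the mapping explicitly instead of relying on position. A sketch with hypothetical column names:

```shell
# Assumes the CSV rows in /user/test/data carry values in the order
# id, name, created; --columns maps them to the Oracle columns explicitly.
sqoop export \
  --connect jdbc:oracle:thin:@oradb.example.com:1521:ORCL \
  --username user --password passwd \
  --table bar \
  --columns "ID,NAME,CREATED" \
  --export-dir /user/test/data \
  --fields-terminated-by ','
```

--fields-terminated-by is worth setting explicitly too, since the HDFS files may use a delimiter other than Sqoop's default.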
