
Sqoop export from Hive partitioned Parquet table to Oracle

Explorer

Hi gurus,

Is it possible to do a Sqoop export from a partitioned Parquet Hive table to an Oracle database?

Our requirement is to feed processed data to a legacy system that cannot connect to Hadoop/Hive directly. Thank you.

3 Replies

Champion

@bukangarii

 

As long as you have JDBC connectivity to your legacy system, it is possible to export the Parquet Hive table to it.

 

Please check the Sqoop user guide to understand the supported data types.
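
For illustration only, a minimal sketch of what such an export over JDBC might look like, assuming Sqoop's HCatalog integration is available (it lets Sqoop read the partitioned Parquet table through Hive metadata rather than raw files). All names here are hypothetical: an Oracle table MYSCHEMA.PROCESSED_DATA and a Hive table default.processed_data. Whether this works for Parquet depends on your Sqoop and HCatalog versions, so treat it as a starting point rather than a guarantee.

    # export a Hive-managed Parquet table to Oracle via HCatalog (hypothetical names)
    sqoop export \
      --connect jdbc:oracle:thin:@//oracle-host:1521/ORCLPDB \
      --username scott \
      --password-file /user/scott/.oracle_pw \
      --table MYSCHEMA.PROCESSED_DATA \
      --hcatalog-database default \
      --hcatalog-table processed_data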

New Contributor

This is incorrect. Sqoop can only export files in an HDFS folder to an Oracle table. If you have tables in Parquet format, you first need to query the Parquet table and write the output as delimited text to an HDFS folder. Then you can use Sqoop export to copy that data to an Oracle table. Sqoop was built to bring data into HDFS; support for export out of HDFS is very limited.
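
For what it's worth, a rough sketch of that two-step workaround; the paths, table names, and connection details below are made up for illustration:

    # step 1: dump the Parquet table as comma-delimited text into an HDFS directory
    hive -e "INSERT OVERWRITE DIRECTORY '/tmp/processed_data_export'
             ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
             SELECT * FROM default.processed_data"

    # step 2: export the delimited files to the Oracle table
    sqoop export \
      --connect jdbc:oracle:thin:@//oracle-host:1521/ORCLPDB \
      --username scott \
      --password-file /user/scott/.oracle_pw \
      --table MYSCHEMA.PROCESSED_DATA \
      --export-dir /tmp/processed_data_export \
      --input-fields-terminated-by ','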

Explorer

Late reply, but it is indeed possible to export the underlying Parquet files of the table, with these limitations:

1. As with other file formats, BlobRef is not supported.
2. Files are read via the Kite SDK, which currently requires the .metadata directory to be present.

 

https://issues.apache.org/jira/browse/SQOOP-1394
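
In case it helps anyone reading this later, a sketch of that direct Parquet export (hypothetical names; note that the second limitation above means the export directory must contain a Kite .metadata directory, which plain Hive-created Parquet tables usually do not have):

    # export Parquet files directly; Sqoop reads them through the Kite SDK
    sqoop export \
      --connect jdbc:oracle:thin:@//oracle-host:1521/ORCLPDB \
      --username scott \
      --password-file /user/scott/.oracle_pw \
      --table MYSCHEMA.PROCESSED_DATA \
      --export-dir /user/hive/warehouse/processed_data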