New Contributor
Posts: 4
Registered: ‎04-11-2017

sqoop export from hive partitioned parquet table to oracle


Hi gurus,

Is it possible to do a Sqoop export from a partitioned Parquet Hive table to an Oracle database?

Our requirement is to feed the processed data into a legacy system that cannot connect to Hadoop/Hive. Thank you.

Posts: 512
Topics: 14
Kudos: 85
Solutions: 45
Registered: ‎09-02-2016

Re: sqoop export from hive partitioned parquet table to oracle

@bukangarii

 

As long as you have JDBC connectivity to your legacy system, it is possible to export the Parquet Hive table to it.

 

Please check the Sqoop user guide to understand the supported data types.
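Something along these lines may work if HCatalog integration is available on your cluster. The connection string, credentials, and table names below are only placeholders, the Oracle target table has to exist already, and whether Parquet is handled through HCatalog depends on your Sqoop version, so test on a small table first:

# placeholder host, credentials, and table names -- adjust for your environment
sqoop export \
  --connect jdbc:oracle:thin:@//oracle-host:1521/ORCLPDB \
  --username my_user \
  --password-file /user/my_user/oracle.password \
  --table MYSCHEMA.PROCESSED_DATA \
  --hcatalog-database default \
  --hcatalog-table processed_data \
  -m 4

The --hcatalog-table option lets Sqoop read the Hive table definition through HCatalog instead of pointing at raw HDFS files.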

New Contributor
Posts: 3
Registered: ‎02-14-2018

Re: sqoop export from hive partitioned parquet table to oracle

This is incorrect. Sqoop export can only move files sitting in an HDFS directory into an Oracle table. If your tables are stored as Parquet, you first need to query the Parquet table and write the output as delimited text to an HDFS directory; then you can use Sqoop export to copy that data to the Oracle table. Sqoop was built to bring data into HDFS, and support for exporting data out of HDFS is very limited.
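A rough sketch of that two-step approach (paths, database/table names, and the connection string are placeholders; ROW FORMAT on INSERT OVERWRITE DIRECTORY needs Hive 0.11 or later):

# step 1: dump the Parquet table as comma-delimited text into an HDFS directory
hive -e "INSERT OVERWRITE DIRECTORY '/tmp/processed_data_export'
         ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
         SELECT * FROM mydb.processed_data"

# step 2: export that delimited text into the Oracle table
sqoop export \
  --connect jdbc:oracle:thin:@//oracle-host:1521/ORCLPDB \
  --username my_user \
  --password-file /user/my_user/oracle.password \
  --table MYSCHEMA.PROCESSED_DATA \
  --export-dir /tmp/processed_data_export \
  --input-fields-terminated-by ','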
