Created on 04-11-2018 02:22 AM - edited 09-16-2022 06:05 AM
Hi gurus,
Is it possible to do a Sqoop export from a Parquet partitioned Hive table to an Oracle database?
Our requirement is to make the processed data available to a legacy system that cannot connect to Hadoop/Hive. Thank you.
Created 04-11-2018 05:08 AM
As long as you have JDBC connectivity to your legacy system, it is possible to export the Parquet Hive table to it.
Please check the Sqoop user guide to understand the supported data types.
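A minimal sketch of such an export, going through HCatalog so Sqoop reads the Hive table rather than raw files (the connection string, credentials, and table names below are placeholders, not values from this thread):

    sqoop export \
      --connect jdbc:oracle:thin:@//dbhost:1521/ORCLPDB \
      --username scott --password-file /user/scott/.ora_pass \
      --table TARGET_TABLE \
      --hcatalog-database default \
      --hcatalog-table parquet_hive_table

Whether the HCatalog path handles your Parquet table depends on the Sqoop and Hive versions in your distribution, so test it on a small table first.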
Created 06-27-2018 08:21 AM
This is incorrect. Sqoop can only export files in an HDFS folder to an Oracle table. If you have tables in Parquet format, you need to first query the Parquet table and write the output as delimited text to an HDFS folder. Then you can use Sqoop export to copy that data to an Oracle table. Sqoop was built to bring data into HDFS; support for exporting out of HDFS is very limited.
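A rough sketch of that two-step approach (paths, credentials, and table names are illustrative placeholders):

    -- HiveQL: dump the Parquet table as delimited text into an HDFS staging folder
    INSERT OVERWRITE DIRECTORY '/tmp/oracle_export_staging'
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    SELECT * FROM mydb.parquet_table;

    # then export the staged text files to the Oracle table
    sqoop export \
      --connect jdbc:oracle:thin:@//dbhost:1521/ORCLPDB \
      --username scott --password-file /user/scott/.ora_pass \
      --table TARGET_TABLE \
      --export-dir /tmp/oracle_export_staging \
      --input-fields-terminated-by ','

Note that Hive writes NULLs as \N in text output, so you may also need --input-null-string and --input-null-non-string depending on your data.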
Created 02-27-2019 02:53 AM
Late reply, but it is indeed possible to export the underlying Parquet files of the table directly, with the following limitations (a command sketch follows the list):
1. As with exports of other file formats, BlobRef is not supported.
2. The files are read via the Kite SDK, which currently requires the .metadata directory to be present.
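A sketch of the direct approach, assuming the table's warehouse directory is a Kite-readable Parquet dataset with its .metadata directory in place (paths, credentials, and names are placeholders):

    sqoop export \
      --connect jdbc:oracle:thin:@//dbhost:1521/ORCLPDB \
      --username scott --password-file /user/scott/.ora_pass \
      --table TARGET_TABLE \
      --export-dir /user/hive/warehouse/mydb.db/parquet_table

If the .metadata directory is missing, Kite cannot read the dataset and the export fails, which is why the text-staging approach described above is often the more reliable option.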