SQOOP export --hive-import

New Contributor

Is it possible to export files to MSSQL/DB2/Oracle when the table was imported with the --hive-import option?

A file seems to be missing from the ".metadata" directory.

 

Error:

Unable to load descriptor file:hdfs://quickstart.cloudera:8020/user/hive/warehouse/customer/.metadata/descriptor.properties

 

On checking, the .metadata folder contains the file descriptor.properties only when --hive-import is NOT present as a parameter.

 

Also, is it possible to export parquet files?

Thanks to anyone who can answer.


New Contributor

This may or may not be helpful: I'm exporting tables from the cluster that are stored in Avro format. I couldn't use the --export-dir parameter because of their metadata, but using --hcatalog-table is working to connect the export to the table metadata, so Sqoop knows the structure of the data being exported.

 

For me, this exports to Postgres. I'd expect the same behavior with any other relational database, since this looks to me like a question of metadata on the cluster end of the export rather than of the insert/update statements on the relational database side.
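For reference, here is a rough sketch of the kind of --hcatalog-table export described above. The JDBC URL, credentials, and the database/table names (mydb, customer) are placeholders, not values from this thread; adapt them to your environment.

```shell
# Hypothetical Sqoop export reading table structure from HCatalog metadata
# instead of pointing --export-dir at raw files under the warehouse path.
sqoop export \
  --connect jdbc:postgresql://dbhost:5432/mydb \
  --username dbuser \
  --password-file /user/dbuser/.db-password \
  --table customer \
  --hcatalog-database default \
  --hcatalog-table customer
```

Because --hcatalog-table resolves the schema and storage format (Avro, Parquet, etc.) through the Hive/HCatalog metastore, it sidesteps the need for a Kite-style .metadata/descriptor.properties file alongside the data files.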
