Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.
Announcements
This board is archived and read-only for historical reference. To ask a new question, please post a new topic on the appropriate active board.

Import data directly in as-parquetfile format

Expert Contributor

Hello, I would like to know how to import data with Sqoop directly in Parquet format using --as-parquetfile. Is that possible? Here is what I use:

sqoop import \
  -libjars jtds-1.3.1.jar \
  --connect "jdbc:jtds:sqlserver://xxxxxxx:xxxx;databaseName=xxxxxx;user=xxxxx;password=xxxxxxx;instance=xxxxx" \
  --driver net.sourceforge.jtds.jdbc.Driver \
  --username xxxx \
  --table xxxx \
  --hive-import \
  --hive-database xxxxxx \
  --hive-table xxxx \
  --as-parquetfile \
  --m 1

Here is the error it returns:
Caused by: org.apache.hadoop.ipc.RemoteException(java.io.FileNotFoundException): File does not exist: /apps/hive/warehouse/"name_table"/.metadata/schemas/1.avsc

Thank you !


1 ACCEPTED SOLUTION

Expert Contributor

Ravi Mutyala thank you for your answer


3 REPLIES

Guru

Is this an existing table that is based on Avro? From the logs, that looks like the case. If you want to import as Parquet, try creating a new Hive table with the above command.
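The suggestion above can be sketched as follows. All table, database, and connection names here are placeholders, not values from the original post; adjust them for your environment. The idea is to point the import at a table name that does not yet exist (or drop the old Avro-backed one first), so that Sqoop creates the Hive table as Parquet instead of colliding with the existing Avro metadata.

```shell
# Option A: drop the old Avro-backed table so the same name can be recreated
# as Parquet (destructive -- only if you no longer need the Avro copy).
hive -e 'DROP TABLE IF EXISTS my_database.my_table;'

# Option B (safer): keep the old table and import into a new Parquet table.
sqoop import \
  -libjars jtds-1.3.1.jar \
  --connect "jdbc:jtds:sqlserver://dbhost:1433;databaseName=mydb;instance=myinstance" \
  --driver net.sourceforge.jtds.jdbc.Driver \
  --username myuser -P \
  --table my_table \
  --hive-import \
  --hive-database my_database \
  --hive-table my_table_parquet \
  --as-parquetfile \
  -m 1
```

With --hive-import and --as-parquetfile, Sqoop creates the target Hive table itself when it does not already exist; the FileNotFoundException on .metadata/schemas/1.avsc in the original post is consistent with Sqoop finding pre-existing Avro-dataset metadata under the warehouse directory for that table name.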

Expert Contributor

Ravi Mutyala thank you for your answer

Guru

@alain TSAFACK Please accept the answer that actually resolved your question. Avoid accepting your own reply unless you did further research after asking and found the solution yourself.