Support Questions

Import data directly in as-parquetfile format

Expert Contributor

Hello, I would like to know how to import data with Sqoop directly in Parquet format using --as-parquetfile. Is it possible? Here is what I use:

sqoop import \
  -libjars jtds-1.3.1.jar \
  --connect "jdbc:jtds:sqlserver://xxxxxxx:xxxx;databaseName=xxxxxx;user=xxxxx;password=xxxxxxx;instance=xxxxx" \
  --driver net.sourceforge.jtds.jdbc.Driver \
  --username xxxx \
  --table xxxx \
  --hive-import \
  --hive-database xxxxxx \
  --hive-table xxxx \
  --as-parquetfile \
  -m 1

Here is the error it returns:
Caused by: org.apache.hadoop.ipc.RemoteException(java.io.FileNotFoundException): File does not exist: /apps/hive/warehouse/"name_table"/.metadata/schemas/1.avsc

Thank you !


1 ACCEPTED SOLUTION

Expert Contributor
3 REPLIES

Guru

Is this an existing table that is based on Avro? From the logs, that looks like the case. If you want to import as Parquet, try creating a new Hive table with the above command.
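A minimal sketch of that suggestion, assuming the same jTDS connection as the original command (host, database, credentials, and table names here are placeholders, not values from the thread). The only substantive change is pointing --hive-table at a new table name, so Sqoop creates fresh Parquet metadata instead of tripping over the .metadata/schemas/1.avsc left behind by an earlier Avro-backed import:

```shell
# Import into a NEW Hive table so stale Avro metadata from the
# existing table is not reused. All identifiers below are examples.
sqoop import \
  -libjars jtds-1.3.1.jar \
  --connect "jdbc:jtds:sqlserver://dbhost:1433;databaseName=mydb;instance=myinstance" \
  --driver net.sourceforge.jtds.jdbc.Driver \
  --username myuser \
  --password-file /user/myuser/.db-password \
  --table source_table \
  --hive-import \
  --hive-database mydb \
  --hive-table source_table_parquet \
  --as-parquetfile \
  -m 1
```

If the old table is no longer needed, dropping it in Hive (and removing its warehouse directory, including the leftover .metadata folder) before re-running the original command should also clear the FileNotFoundException.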


Guru

@alain TSAFACK Please accept the answer that actually resolved your question. Avoid accepting your own answers unless you researched the problem after asking and found the solution yourself.