Import data directly in as-parquetfile format
Labels:
- Apache Hadoop
- Apache Hive
- Apache Sqoop
Created on ‎04-25-2016 03:10 PM - edited ‎09-16-2022 03:15 AM
Hello, I would like to know how to import data with Sqoop while writing it directly as Parquet with --as-parquetfile. Is that possible? Here is what I use:
sqoop import \
  -libjars jtds-1.3.1.jar \
  --connect "jdbc:jtds:sqlserver://xxxxxxx:xxxx;databaseName=xxxxxx;user=xxxxx;password=xxxxxxx;instance=xxxxx" \
  --driver net.sourceforge.jtds.jdbc.Driver \
  --username xxxx \
  --table xxxx \
  --hive-import \
  --hive-database xxxxxx \
  --hive-table xxxx \
  --as-parquetfile \
  --m 1
Here is the error it returns:

Caused by: org.apache.hadoop.ipc.RemoteException(java.io.FileNotFoundException): File does not exist: /apps/hive/warehouse/"name_table"/.metadata/schemas/1.avsc

Thank you!
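A quick way to confirm the stale metadata the error points at is to list the path from the message directly (name_table stands in for the real table name):

hdfs dfs -ls /apps/hive/warehouse/name_table/.metadata/schemas/

If the directory exists but the schema file it expects is missing or outdated, the import is tripping over metadata left behind by an earlier import into the same table, which is what the answer below points at.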
Created ‎04-25-2016 05:01 PM
Is this an existing table that is based on Avro? From the logs, that looks like the case. If you want to import as Parquet, try creating a new Hive table with the above command.
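A minimal sketch of that suggestion, assuming a brand-new target table (new_table_parquet is a placeholder name); the only change from the command in the question is the Hive table name, so Sqoop creates fresh Parquet dataset metadata instead of reusing the old Avro-backed warehouse directory:

sqoop import \
  -libjars jtds-1.3.1.jar \
  --connect "jdbc:jtds:sqlserver://xxxxxxx:xxxx;databaseName=xxxxxx;user=xxxxx;password=xxxxxxx;instance=xxxxx" \
  --driver net.sourceforge.jtds.jdbc.Driver \
  --username xxxx \
  --table xxxx \
  --hive-import \
  --hive-database xxxxxx \
  --hive-table new_table_parquet \
  --as-parquetfile \
  --m 1

Importing into a table that does not exist yet avoids the lookup against the old table's .metadata/schemas/1.avsc file that caused the FileNotFoundException above.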
Created ‎05-09-2016 10:10 AM
@Ravi Mutyala Thank you for your answer.
Created ‎05-13-2016 06:52 PM
@alain TSAFACK Please accept the answer that resolved your question. Avoid accepting your own replies unless you did additional research after asking and found the answer yourself.
