Error in exercise 2: importing tables from MySQL to HDFS
Labels: Apache Hadoop, Apache Hive, Apache Sqoop, HDFS
Created on 10-26-2015 07:33 PM - edited 09-16-2022 02:46 AM
When I run the following command:
sqoop import-all-tables \
  -m 1 \
  --connect jdbc:mysql://quickstart:3306/retail_db \
  --username=retail_dba \
  --password=cloudera \
  --compression-codec=snappy \
  --as-avrodatafile \
  --warehouse-dir=/user/hive/warehouse
I get the following error:
/usr/lib/sqoop/bin/sqoop: line 101: /usr/jars/bin/hadoop: No such file or directory
/usr/lib/sqoop/bin/sqoop: line 101: exec: /usr/jars/bin/hadoop: cannot execute: No such file or directory
I tried to remove that file using:
hadoop fs -rm -r -skipTrash /usr/jars/bin/hadoop
I still get the same error. Any suggestions?
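For context, /usr/jars/bin/hadoop is a path on the local filesystem, not in HDFS, so hadoop fs -rm cannot remove it (and removing it would not help; the problem is that no hadoop binary exists at that path). The Sqoop launcher builds that path from its Hadoop home environment variable, so a first step is to check where the real hadoop binary lives and what the environment says. A minimal diagnostic sketch, assuming the stock QuickStart VM layout (the /usr/lib/hadoop location is an assumption):

# Where is the real hadoop binary?
which hadoop
ls -l /usr/lib/hadoop/bin/hadoop   # typical CDH location (assumption)
# What Hadoop home will Sqoop resolve?
echo "HADOOP_HOME=$HADOOP_HOME"
echo "HADOOP_COMMON_HOME=$HADOOP_COMMON_HOME"
# If either points at /usr/jars, re-point it before rerunning sqoop, e.g.:
export HADOOP_COMMON_HOME=/usr/lib/hadoop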
Created 10-27-2015 05:19 AM
Perhaps this other thread will be of assistance: "tutorial exercise 2: no tables".
Cy Jervis, Manager, Community Program
Created 10-27-2015 05:15 PM
Thanks for the suggestion! I tried the commands from that thread, but I still have the same problem.
Created 10-28-2015 04:51 AM
Sorry to hear that. Hopefully someone more knowledgeable than I am will jump in with something better.
Cy Jervis, Manager, Community Program
Created on 10-28-2015 06:04 AM - edited 10-28-2015 06:22 AM
Thanks for following up. The problem is solved: I started the Hive server and took the NameNode out of safe mode. After that, the command from the thread you posted finally worked.
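For anyone who lands here with the same symptoms, those two steps amount to something like the following on the QuickStart VM (the hive-server2 service name is an assumption about the CDH setup; the dfsadmin call is standard HDFS):

# Start HiveServer2 (CDH init-script name; adjust for your install)
sudo service hive-server2 start
# Take the NameNode out of safe mode so writes to HDFS succeed
sudo -u hdfs hdfs dfsadmin -safemode leave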
