Error in exercise 2: importing tables from MySQL into HDFS

Explorer

When I run the following command:

sqoop import-all-tables \
    -m 1 \
    --connect jdbc:mysql://quickstart:3306/retail_db \
    --username=retail_dba \
    --password=cloudera \
    --compression-codec=snappy \
    --as-avrodatafile \
    --warehouse-dir=/user/hive/warehouse

I get the following error:

/usr/lib/sqoop/bin/sqoop: line 101: /usr/jars/bin/hadoop: No such file or directory
/usr/lib/sqoop/bin/sqoop: line 101: exec: /usr/jars/bin/hadoop: cannot execute: No such file or directory

I tried to remove that file using:

hadoop fs -rm -r -skipTrash /usr/jars/bin/hadoop

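Note: hadoop fs -rm removes paths from HDFS, not from the local filesystem, so it cannot delete /usr/jars/bin/hadoop. The error message says Sqoop's wrapper script is exec'ing a hadoop binary at a local path that does not exist. A quick local check, assuming a typical CDH layout with Hadoop under /usr/lib/hadoop, might look like:

# confirm the path from the error really is missing on the local filesystem
ls -l /usr/jars/bin/hadoop

# locate the hadoop binary that is actually installed
which hadoop

# Sqoop execs $HADOOP_COMMON_HOME/bin/hadoop, so pointing that variable at the
# real install is one possible workaround (the path below is an assumption)
export HADOOP_COMMON_HOME=/usr/lib/hadoop
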

I still get the same error.

Any suggestions?


5 REPLIES

Community Manager (Accepted Solution)

Perhaps this other thread will be of assistance.  

tutorial exercise 2: no tables


Cy Jervis, Manager, Community Program

Explorer

Thanks for the suggestion! I tried the commands from that thread, but I still have the same problem.

Community Manager

Sorry to hear that. Hopefully someone more knowledgeable than I will jump in with something better.


Cy Jervis, Manager, Community Program

Explorer (Accepted Solution)

Thanks for following up. The problem is solved: I started the Hive server and made the NameNode leave safe mode. After that, the command from the thread you posted finally worked.

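In case the exact commands help someone else, the two steps above would look roughly like this on the CDH QuickStart VM (the service name is an assumption; adjust for how Hive is run in your setup):

# start the Hive server (assumed service name on the QuickStart VM)
sudo service hive-server2 start

# tell the NameNode to leave safe mode (run as the hdfs superuser)
sudo -u hdfs hdfs dfsadmin -safemode leave

After that, re-running the sqoop import-all-tables command from the original post should go through.
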
Community Manager
Great! Feel free to mark your last comment as the solution. Perhaps it will help others as well.

Cy Jervis, Manager, Community Program