
Exercise 1 Sqoop import fails

SOLVED

Re: Exercise 1 Sqoop import fails

Super Collaborator

If you want to "drop" the categories table, you should run a Hive query like this:

DROP TABLE categories;

 

If you only want to "delete" the contents of the table, try "TRUNCATE TABLE categories;". That should work; otherwise, delete the table's contents in HDFS directly.
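For reference, a minimal sketch of how those statements could be run from the QuickStart VM shell with the Hive CLI (assuming the default "hive" client is on the PATH):

# Drops the table definition and, for a managed table, its data
hive -e "DROP TABLE categories;"
# Keeps the table definition but removes all of its rows
hive -e "TRUNCATE TABLE categories;"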

 

As for your use of "hadoop fs": "hadoop fs -ls rm" is not a valid command.

To delete HDFS files or folders, use "hadoop fs -rm" directly.
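For example, to remove just the directory Sqoop created for the categories table (assuming the warehouse path used in this exercise), something like:

# -r deletes the directory and everything under it recursively
hadoop fs -rm -r /user/hive/warehouse/categories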

Re: Exercise 1 Sqoop import fails

@mathieu.d

Thanks, I was able to delete using hadoop fs -rm. However, I get an exception like the user above as well; the import does complete, though, and hadoop fs -ls /user/hive/warehouse shows me all the tables as below:

 

[cloudera@quickstart ~]$ hadoop fs -ls /user/hive/warehouse
Found 6 items
drwxr-xr-x - cloudera supergroup 0 2016-12-19 14:10 /user/hive/warehouse/categories
drwxr-xr-x - cloudera supergroup 0 2016-12-19 14:11 /user/hive/warehouse/customers
drwxr-xr-x - cloudera supergroup 0 2016-12-19 14:11 /user/hive/warehouse/departments
drwxr-xr-x - cloudera supergroup 0 2016-12-19 14:11 /user/hive/warehouse/order_items
drwxr-xr-x - cloudera supergroup 0 2016-12-19 14:12 /user/hive/warehouse/orders
drwxr-xr-x - cloudera supergroup 0 2016-12-19 14:12 /user/hive/warehouse/products
[cloudera@quickstart ~]$ hadoop fs -ls /user/hive/warehouse/categories
Found 2 items
-rw-r--r-- 1 cloudera supergroup 0 2016-12-19 14:10 /user/hive/warehouse/categories/_SUCCESS
-rw-r--r-- 1 cloudera supergroup 1427 2016-12-19 14:10 /user/hive/warehouse/categories/part-m-00000

 

But in the Hue browser, Impala's "show tables;" still shows only the categories table. Any idea what went wrong? Below is the command I used, as suggested.

 

sqoop import-all-tables -m 1 --connect jdbc:mysql://quickstart.cloudera:3306/retail_db --username=retail_dba --password=cloudera --compression-codec=snappy --as-sequencefile --warehouse-dir=/user/hive/warehouse --hive-overwrite

16/12/19 14:12:14 WARN hdfs.DFSClient: Caught exception
java.lang.InterruptedException
at java.lang.Object.wait(Native Method)
at java.lang.Thread.join(Thread.java:1281)
at java.lang.Thread.join(Thread.java:1355)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:862)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:600)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:789)
16/12/19 14:12:14 WARN hdfs.DFSClient: Caught exception
java.lang.InterruptedException
at java.lang.Object.wait(Native Method)
at java.lang.Thread.join(Thread.java:1281)
at java.lang.Thread.join(Thread.java:1355)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:862)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:600)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:789)
16/12/19 14:12:15 WARN hdfs.DFSClient: Caught exception
java.lang.InterruptedException
at java.lang.Object.wait(Native Method)
at java.lang.Thread.join(Thread.java:1281)
at java.lang.Thread.join(Thread.java:1355)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:862)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:600)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:789)
16/12/19 14:12:15 INFO db.DBInputFormat: Using read commited transaction isolation
16/12/19 14:12:15 WARN hdfs.DFSClient: Caught exception
java.lang.InterruptedException


Re: Exercise 1 Sqoop import fails

New Contributor

Do you need the --override?

 

I reran my Tutorial 1 and it didn't append new records... I thought it would. Why do you think it allowed it?