Member since 09-12-2016 · 13 Posts · 1 Kudo Received · 0 Solutions
12-19-2016 05:33 PM
@harish172 were you able to figure out the cause of the exception? Are you able to see all the tables, including customers, departments, order_items, and products, under warehouse in the Hive query browser? I only see the table categories after the exception appears. The command I used is typed below in the thread.
12-19-2016 02:44 PM
@mathieu.d Thanks, I was able to delete using hadoop fs -rm. However, I get an exception like the user above as well. The import does complete, and hadoop fs -ls /user/hive/warehouse shows me all the tables as below:

[cloudera@quickstart ~]$ hadoop fs -ls /user/hive/warehouse
Found 6 items
drwxr-xr-x - cloudera supergroup 0 2016-12-19 14:10 /user/hive/warehouse/categories
drwxr-xr-x - cloudera supergroup 0 2016-12-19 14:11 /user/hive/warehouse/customers
drwxr-xr-x - cloudera supergroup 0 2016-12-19 14:11 /user/hive/warehouse/departments
drwxr-xr-x - cloudera supergroup 0 2016-12-19 14:11 /user/hive/warehouse/order_items
drwxr-xr-x - cloudera supergroup 0 2016-12-19 14:12 /user/hive/warehouse/orders
drwxr-xr-x - cloudera supergroup 0 2016-12-19 14:12 /user/hive/warehouse/products
[cloudera@quickstart ~]$ hadoop fs -ls /user/hive/warehouse/categories
Found 2 items
-rw-r--r-- 1 cloudera supergroup 0 2016-12-19 14:10 /user/hive/warehouse/categories/_SUCCESS
-rw-r--r-- 1 cloudera supergroup 1427 2016-12-19 14:10 /user/hive/warehouse/categories/part-m-00000

But in the Hue browser > Impala, show tables; still shows only the categories table. Any idea what went wrong? Below is the command I used as suggested.
sqoop import-all-tables -m 1 --connect jdbc:mysql://quickstart.cloudera:3306/retail_db --username=retail_dba --password=cloudera --compression-codec=snappy --as-sequencefile --warehouse-dir=/user/hive/warehouse --hive-overwrite

16/12/19 14:12:14 WARN hdfs.DFSClient: Caught exception
java.lang.InterruptedException
  at java.lang.Object.wait(Native Method)
  at java.lang.Thread.join(Thread.java:1281)
  at java.lang.Thread.join(Thread.java:1355)
  at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:862)
  at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:600)
  at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:789)
(the same InterruptedException warning and stack trace repeats twice more at 14:12:14 and 14:12:15)
16/12/19 14:12:15 INFO db.DBInputFormat: Using read commited transaction isolation
16/12/19 14:12:15 WARN hdfs.DFSClient: Caught exception java.lang.InterruptedException
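A likely explanation for "files in HDFS but tables missing in Impala": the command above only writes files into the warehouse directory; it never registers the tables in the Hive metastore, so Hive and Impala cannot see them. A minimal sketch, assuming the same quickstart VM and retail_db credentials (note that Sqoop's Hive import does not accept --as-sequencefile, so the sketch switches to Parquet):

```shell
# --hive-import registers each imported table in the Hive metastore,
# which is what makes it visible to Hive and Impala. A bare
# --warehouse-dir import only drops files into HDFS.
sqoop import-all-tables -m 1 \
  --connect jdbc:mysql://quickstart.cloudera:3306/retail_db \
  --username=retail_dba --password=cloudera \
  --compression-codec=snappy \
  --as-parquetfile \
  --warehouse-dir=/user/hive/warehouse \
  --hive-import --hive-overwrite

# Impala caches the catalog, so refresh it before running show tables;
impala-shell -q 'INVALIDATE METADATA;'
```

This is a sketch of the metastore-registration step, not a verified fix for the InterruptedException warnings, which are a separate (and usually benign) shutdown-race message from the HDFS client.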
12-10-2016 10:21 AM
@csguna hadoop fs -ls /user/hive/warehouse/categories shows only .metadata, and I cannot remove that file using hadoop fs -rm /user/hive/warehouse/categories/.metadata. Also, in Hue > Query editor, select * from categories keeps running for a while with no result, and delete from categories in the Hive query editor fails with "attempt to delete using transaction manager that does not support these operations". Lastly, hive on the command line shows nothing for hive> show tables;. How do I get rid of the categories table so I can run the command with overwrite as you mentioned?
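A sketch of the cleanup described above, assuming the quickstart VM's default paths: the HDFS directory has to be removed recursively (plain -rm refuses directories, which is why deleting .metadata directly fails), and the metastore entry is dropped with DROP TABLE rather than DELETE FROM, since Hive without transactions cannot delete rows:

```shell
# Remove the leftover import directory, including the hidden
# .metadata subdirectory, in one recursive delete:
hadoop fs -rm -r /user/hive/warehouse/categories

# Drop the table definition from the metastore as well, so a fresh
# import can recreate it cleanly:
hive -e 'DROP TABLE IF EXISTS categories;'
```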
12-10-2016 01:30 AM
I am using Cloudera Express, so MySQL is on quickstart.cloudera. But I keep getting a "Retrying to connect to server" error. MySQL is present under /var/lib/mysql. I have run sudo chmod 777 /var/lib/mysql as well, but no luck. Any idea what is wrong here? mysql gives an access-denied error for user cloudera@localhost (using password: NO).
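Two quick checks worth running here, sketched below under the assumption of the CDH quickstart VM. The "using password: NO" part of the access-denied message means no password was sent at all, so connecting with explicit credentials is the first thing to try once the daemon is confirmed running:

```shell
# Check whether the MySQL daemon is running before touching
# filesystem permissions; start it if it is stopped:
sudo service mysqld status
sudo service mysqld start

# Connect with the retail_db credentials used in the Sqoop examples
# (note: no space between -p and the password):
mysql -h quickstart.cloudera -u retail_dba -pcloudera retail_db -e 'SHOW TABLES;'
```

If the daemon is running and the explicit-credential connection works, the earlier chmod 777 on /var/lib/mysql was unnecessary (and is worth reverting, since MySQL can refuse to start on world-writable data directories).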
09-12-2016 09:10 PM
Exception in thread "main" org.apache.thrift.transport.TTransportException: Could not create ServerSocket on address 0.0.0.0/0.0.0.0:9083. I have already started the Hive metastore using:

[cloudera@quickstart ~]$ sudo su
[root@quickstart cloudera]# service hive-metastore start
Starting Hive Metastore (hive-metastore): [ OK ]

After which the Hive health goes red, with the above error in stderr and the following on the dashboard: Bad: The Hive Metastore canary failed to create the hue hdfs home directory. Any idea what I am missing?
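"Could not create ServerSocket ... :9083" usually means another metastore process already holds the metastore port, so two copies are fighting over it. A sketch of how to confirm that, assuming the quickstart VM where Cloudera Manager also manages a metastore role:

```shell
# See which process is already listening on the metastore port 9083:
sudo netstat -tlnp | grep 9083
# (equivalently: sudo lsof -i :9083)

# If Cloudera Manager manages the Hive Metastore role, the copy started
# via the init script conflicts with it; stop the init-script instance
# and let Cloudera Manager own the role instead:
sudo service hive-metastore stop
```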
Labels:
- Apache Hive
- Cloudera Hue
- HDFS