
CDH 5.5.0 - Getting Started: tables not visible though import job completed

Explorer

I had an issue with CDH 5.4 hanging, so I decided to remove it and reinstall the latest version.

I downloaded and installed CDH 5.5 and tried to follow the Getting Started tutorial.

After several tries (and with help from the Community forums), I was able to get the Sqoop import from the tutorial to complete correctly.

 

After /user/hive/warehouse/categories was finally created correctly, I went to the Hive and Impala editors and tried to refresh the tables after entering the INVALIDATE METADATA command.
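
The refresh I attempted in the Impala editor was roughly the following (a sketch of the sequence, not the exact statements I typed):

INVALIDATE METADATA;   -- reload table metadata from the metastore
SHOW TABLES;           -- list the tables Impala can now see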

 

For some reason, I am unable to see the tables, though I do see the Parquet files with bytes in each of the table directories (customers, departments, etc.).

 

Please let me know if I am missing anything.

 

Thanks

1 ACCEPTED SOLUTION

Guru
To answer your other question though, I wouldn't expect a different data format to make a difference here. There's enough competition for memory on the system that Hive is constantly doing garbage collection, and that shouldn't have anything to do with what format Sqoop is using for the data.


11 REPLIES

Explorer

Sean - yes, it is a Cloudera Quickstart VM.

 

I deleted the CDH 5.4 VM I had been using for 4-6 weeks, since I had run into some issues and a Windows upgrade.

Loading CDH 5.5 is what caused this problem.

 

 

Explorer

Sean

It finally worked.

 

After your input, and after reading several of your replies on other posts, I did the following:

 

1. On my Cloudera VM, I reset the memory to 8 GB. I also removed the two other VM images I had.

2. I restarted the VM and deleted the files under the Hive warehouse with this command: sudo -u hdfs hadoop fs -rm -r /user/hive/warehouse/\*

3. I redid the Sqoop command from tutorial 1 using:

sqoop import-all-tables \
-m 3 \
--connect jdbc:mysql://208.113.123.213:3306/retail_db \
--username=retail_dba \
--password=cloudera \
--compression-codec=snappy \
--as-avrodatafile \
--warehouse-dir=/user/hive/warehouse \
--hive-import

 

Note - I used the Avro format.

 

4. After the job finished, I went to the Hive editor and refreshed the database. After the refresh, it showed the 6 tables I was expecting.

 

5. I then went to the Impala editor and did the same refresh (after INVALIDATE METADATA; SHOW TABLES;).

 

6. Next, I went to the tables and looked at the table statistics. That ran a job, and the table entries started showing up (statistics updated with row counts, byte sizes, etc.); the corresponding Impala statements are sketched below.
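
For anyone following along, steps 5 and 6 correspond roughly to these Impala statements (a sketch; customers is just one of the six imported tables, and I am assuming the Hue table-statistics action runs COMPUTE STATS under the hood):

INVALIDATE METADATA;          -- reload table metadata from the metastore
SHOW TABLES;                  -- list the imported tables

COMPUTE STATS customers;      -- gather row counts and sizes for one table
SHOW TABLE STATS customers;   -- view the updated statistics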

 

Bottom line: the Avro format worked.

 

I have to work on the tables right now, so I did not use the Parquet file format. I shall retry that once my immediate work is done.

 

Without your assistance on this particular post, and without reading your other posts, this would not have been solved.

I thank you for that and appreciate your help.