12-04-2015 05:11 PM
Sean, it finally worked. After your inputs and reading several of your replies on other posts, I did the following:

1. On my Cloudera VM I reset the memory to 8GB. I had 2 other images, which I removed.

2. I restarted the VM and deleted the files under the Hive warehouse with this command:

   sudo -u hdfs hadoop fs -rm -r /user/hive/warehouse/\*

3. I redid the sqoop command per tutorial 1, using:

   sqoop import-all-tables \
     -m 3 \
     --connect jdbc:mysql://208.113.123.213:3306/retail_db \
     --username=retail_dba \
     --password=cloudera \
     --compression-codec=snappy \
     --as-avrodatafile \
     --warehouse-dir=/user/hive/warehouse \
     --hive-import

   Note - I used the Avro format (--as-avrodatafile).

4. After the job finished, I went to the Hive Editor and refreshed the database. After the refresh, it showed the 6 tables I was expecting.

5. I then went to the Impala editor and did the same refresh (after INVALIDATE METADATA; SHOW TABLES;).

6. Next, I went to the tables and looked at the table statistics. That ran a job and the table entries started showing up (statistics updated with row count, byte size, etc.). A sketch of the Impala statements for steps 5 and 6 is below.

Bottom line, the Avro format worked. I have to work on the tables right now, so I did not try the Parquet file format. I shall retry that once my immediate work is done. Without your assistance on this particular post, and without reading your other posts, this would not have been solved. I thank you for that and appreciate your help.
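For anyone following along, here is a minimal sketch of the Impala statements behind steps 5 and 6. The table name categories is just an example; substitute any of the tables imported from retail_db.

-- Step 5: make Impala pick up the tables that Sqoop/Hive just created
INVALIDATE METADATA;
SHOW TABLES;

-- Step 6: compute statistics so row counts and byte sizes appear
-- (categories is an assumed example table name from retail_db)
COMPUTE STATS categories;
SHOW TABLE STATS categories;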