Member since: 12-04-2015 · Posts: 7 · Kudos Received: 0 · Solutions: 0
12-04-2015
05:11 PM
Sean, it finally worked. After your inputs, and after reading several of your replies on other posts, I did the following:

1. On my Cloudera VM I reset the memory to 8 GB (I removed the two other VM images I had).
2. I restarted the VM and deleted the files under the Hive warehouse with:
   sudo -u hdfs hadoop fs -rm -r /user/hive/warehouse/\*
3. I re-ran the sqoop command from Tutorial 1, this time using the Avro format:
   sqoop import-all-tables \
     -m 3 \
     --connect jdbc:mysql://208.113.123.213:3306/retail_db \
     --username=retail_dba \
     --password=cloudera \
     --compression-codec=snappy \
     --as-avrodatafile \
     --warehouse-dir=/user/hive/warehouse \
     --hive-import
4. After the job finished, I went to the Hive editor and refreshed the database. After the refresh, it showed the 6 tables I was expecting.
5. I then went to the Impala editor and did the same refresh (after INVALIDATE METADATA; SHOW TABLES;).
6. Next, I went to the tables and looked at the table statistics. That ran a job, and the table entries started showing up (statistics updated with row count, byte size, etc.).

Bottom line, the Avro format worked. I have to work with the tables right now, so I did not retry the Parquet file format; I shall retry that once my immediate work is done. Without your assistance on this particular post, and without reading your other posts, this would not have been solved. I thank you for that and appreciate your help.
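For anyone following along, the same refresh and statistics steps can also be run from the command line instead of the Hue editors. This is only a minimal sketch, assuming the default Quickstart VM setup; the table name categories stands in for any of the six imported tables:

# Check that the imported data landed under the warehouse directory
hadoop fs -ls /user/hive/warehouse

# Make Impala pick up the tables that sqoop/Hive just created, then list them
impala-shell -q "INVALIDATE METADATA"
impala-shell -q "SHOW TABLES"

# Populate the statistics (row counts, sizes) shown in the Impala editor, one table at a time
impala-shell -q "COMPUTE STATS categories"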
12-04-2015
04:13 PM
Sean - yes, it is a Cloudera Quickstart VM. I deleted the CDH 5.4 VM I had been using for 4-6 weeks because of some issues and a Windows upgrade. Loading CDH 5.5 is what caused this problem.
12-04-2015
03:52 PM
Sean, I understand. I am trying to emphasize that the job did finally run without delay or failure, and it did put the 6 or so tables in the directory. But those tables do not show up in the Hive or Impala editor. I tried to add a table using the file option, but the Parquet format did not seem to convert well enough for the table to show up. Bottom line: the sqoop job put the files where they should be, but I just don't see them in the editors (the database does not reflect that those tables have been created). That is why I am asking whether there are other ways to get the tables to show. When I need CM or CM Enterprise I would choose another laptop; right now I have a regular laptop for learning and demo purposes.
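One way to separate "the files are in HDFS" from "the tables exist in the metastore" is to ask Hive directly rather than the editor. A minimal sketch, assuming the default warehouse path from the tutorial:

# Directories sqoop wrote (one per table)
hadoop fs -ls /user/hive/warehouse

# Tables the Hive metastore actually knows about
hive -e "SHOW TABLES;"

# If a table is missing from the second list, --hive-import did not register it,
# and refreshing the editors or running INVALIDATE METADATA in Impala will not make it appear.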
12-04-2015
02:54 PM
Sean, what is the recommended memory? I have used 3, 4, and 6 GB on CDH 5.4, and even up to 8 GB. Can I try the sqoop command using the Avro format instead of the Parquet option? Let me know.
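As a quick check after changing the VM's memory setting, the guest OS itself reports what it actually received (the 8 GB that eventually worked is described in the first post above); a minimal sketch:

# Inside the VM: memory visible to the guest, in megabytes
free -m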
12-04-2015
10:54 AM
Sean, should I look in the log files under /var/log/hive/? Let me know. Here is the file listing for the successful process (for /user/hive/warehouse/products):

Name                                            Size      User      Group       Permissions   Date
.                                                         cloudera  supergroup  drwxr-xr-x    December 03, 2015 09:47 PM
.metadata                                                 cloudera  supergroup  drwxr-xr-x    December 03, 2015 09:47 PM
.signals                                                  cloudera  supergroup  drwxr-xr-x    December 03, 2015 09:47 PM
6f7ab0da-3cbf-40ee-a74a-d73683c68c91.parquet    43.8 KB   cloudera  supergroup  -rw-r--r--    December 03, 2015 09:47 PM

From hive-metastore.log:

2015-12-04 10:12:55,165 WARN [org.apache.hadoop.hive.common.JvmPauseMonitor$Monitor@33267d64]: common.JvmPauseMonitor (JvmPauseMonitor.java:run(188)) - Detected pause in JVM or host machine (eg GC): pause of approximately 12002ms No GCs detected
2015-12-04 10:13:41,510 WARN [org.apache.hadoop.hive.common.JvmPauseMonitor$Monitor@33267d64]: common.JvmPauseMonitor (JvmPauseMonitor.java:run(188)) - Detected pause in JVM or host machine (eg GC): pause of approximately 13720ms No GCs detected
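To the question about the logs: /var/log/hive/ is the right place on the Quickstart VM, and the pauses quoted above can be pulled out without scrolling through the whole file. A minimal sketch, assuming the default log location mentioned in the post:

# Follow the metastore log while re-running the import
tail -f /var/log/hive/hive-metastore.log

# Or show only the JVM pause warnings quoted above
grep "JvmPauseMonitor" /var/log/hive/hive-metastore.log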
12-04-2015
10:27 AM
To the experts: this was the original command I used (after cleaning the files):

sqoop import-all-tables \
  -m 3 \
  --connect jdbc:mysql://208.113.123.213:3306/retail_db \
  --username=retail_dba \
  --password=cloudera \
  --compression-codec=snappy \
  --as-parquetfile \
  --warehouse-dir=/user/hive/warehouse \
  --hive-import

Should I redo it (after deleting the connected folders) using the Avro format instead?

sqoop import-all-tables \
  -m 3 \
  --connect jdbc:mysql://208.113.123.213:3306/retail_db \
  --username=retail_dba \
  --password=cloudera \
  --compression-codec=snappy \
  --as-avrodatafile \
  --warehouse-dir=/user/hive/warehouse \
  --hive-import
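If the import is re-run, the previously created target directories have to be removed first, or sqoop can fail on the existing paths. A minimal sketch of that cleanup (destructive; it assumes everything under the warehouse directory can be discarded, as in the follow-up post above):

# Remove all previously imported table directories from HDFS
sudo -u hdfs hadoop fs -rm -r /user/hive/warehouse/\*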
12-04-2015
09:42 AM
I had an issue with CDH 5.4 hanging, so I preferred to remove it and re-install the latest version. I downloaded and installed CDH 5.5 and tried to follow the Getting Started tutorial. After several tries (with help from the Community forums), I was able to complete the sqoop import correctly. After /user/hive/warehouse/categories and the other directories were finally created correctly, I went to the Hive and Impala editors and tried to refresh the tables after entering the INVALIDATE METADATA command. For some reason, I am unable to see the tables, though I do see the parquet files with bytes for each of the tables (customers, departments, etc.). Please let me know if I am missing anything. Thanks
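For a quick sanity check of the same symptom (data files present but no tables visible), it helps to confirm both halves from a terminal. A minimal sketch, assuming the tutorial's warehouse location and using categories as the example table:

# Confirm the parquet data really is there and non-empty
hadoop fs -du -h /user/hive/warehouse/categories

# Ask Hive itself whether the table is defined; an error here means the
# metastore never got the table, which is why the editors show nothing
hive -e "DESCRIBE categories;"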