09-16-2015 02:10 PM
While trying to execute the Cloudera Live: Exercise 2 example SQL query in Impala using Hue, I get the error below.
Your query has the following error(s):
AnalysisException: Failed to load metadata for table: default.order_items CAUSED BY: TableLoadingException: Problem reading Avro schema at: hdfs://220.127.116.11/user/examples/sqoop_import_order_items.avsc CAUSED BY: ConnectException: Call From g5157-cldramaster-01/10.98.189.5 to 18.104.22.168:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused CAUSED BY: ConnectException: Connection refuse
Please help in resolving this issue. Thanks!
09-16-2015 02:23 PM
09-16-2015 07:53 PM
09-16-2015 09:39 PM - edited 09-16-2015 09:40 PM
This error doesn't seem to be transient. I've tried it many times since noon and repeatedly get the same error message.
I also ran `sudo lsof -i | grep hdfs | grep LISTEN`, and port 8020 is listening (it appears under its service name `intu-ec-svcdisc` in the output). Below is my output.
[root@g5157-cldramaster-01 ~]# sudo lsof -i | grep hdfs | grep LISTEN
java 33597 hdfs 149u IPv4 182417 0t0 TCP g5157-cldramaster-01:50090 (LISTEN)
java 33653 hdfs 139u IPv4 182410 0t0 TCP g5157-cldramaster-01:50070 (LISTEN)
java 33653 hdfs 156u IPv4 182592 0t0 TCP g5157-cldramaster-01:oa-system (LISTEN)
java 33653 hdfs 166u IPv4 182596 0t0 TCP g5157-cldramaster-01:intu-ec-svcdisc (LISTEN)
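Since the service is listening locally but the query still fails with `Connection refused`, it may help to test reachability of the NameNode port from the machine running the query. This is a hedged sketch (not part of the original thread); the hostname and port in the comment are taken from the error message above:

```python
import socket

def can_connect(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example for this thread (run from the node where the query fails):
# can_connect("g5157-cldramaster-01", 8020)
```

If this returns False for the IP address but True for the hostname (or vice versa), that points at the address-binding mismatch discussed below.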
I didn't get exactly what you meant by private IP is source and public IP is target. Can you please elaborate?
09-17-2015 09:19 AM - edited 09-17-2015 09:20 AM
I misspoke. I thought this was on AWS, but I see now it's on GoGrid. I think the issue is that HDFS is listening on "g5157-cldramaster-01". If you ping that hostname from your machines, does it resolve to a 208.* IP address, or a 10.* IP address? The SQL tables are set up using the 208.* (public) IP address, but I think g5157-cldramaster-01 is bound to the 10.* (private) IP address. I think the tutorial should have used the hostname in the CREATE TABLE statements. I'll need to look into why it didn't.
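To make the private/public distinction concrete, Python's standard `ipaddress` module can classify an address. A small sketch (the `208.0.0.1` value is a placeholder for whatever public 208.* address your tables reference, not an address from this thread):

```python
import ipaddress

def is_private(ip: str) -> bool:
    """True for private-range addresses such as 10.0.0.0/8 (RFC 1918)."""
    return ipaddress.ip_address(ip).is_private

print(is_private("10.98.189.5"))  # the cluster's internal 10.* address from the error
print(is_private("208.0.0.1"))    # placeholder public 208.* address
```

A 10.* address is only routable inside the provider's network, which is why DDL that points at one address can fail when the daemon is bound to the other.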
For now, what I would recommend is dropping the tables from the Impala app using the SQL statements below (this should leave the data files Sqoop created in place and just make Hive / Impala forget about the tables as previously created). Then rerun the CREATE TABLE statements but using the hostname g5157-cldramaster-01 instead of the IP address to refer to the Avro schema files.
DROP TABLE categories;
DROP TABLE customers;
DROP TABLE departments;
DROP TABLE orders;
DROP TABLE order_items;
DROP TABLE products;
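The fix when re-running the CREATE TABLE statements amounts to swapping the IP address in the Avro schema URL for the cluster hostname. A minimal sketch of that substitution (the `203.0.113.10` address is a documentation-range placeholder, not the actual IP from your DDL):

```python
# Placeholder IP (TEST-NET-3 documentation range); substitute the IP your CREATE TABLE used.
url_by_ip = "hdfs://203.0.113.10/user/examples/sqoop_import_order_items.avsc"
url_by_hostname = url_by_ip.replace("203.0.113.10", "g5157-cldramaster-01")
print(url_by_hostname)  # hdfs://g5157-cldramaster-01/user/examples/sqoop_import_order_items.avsc
```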
That should get the tables into the state they should be in - but they shouldn't have been wrong in the first place. I'll look into what might've happened...