
DB Connection from Impala

Explorer

Hi,

 

As I continue working through the Cloudera Live tutorial, I am trying to run SELECT statements from Impala. When I do, I get the following error:

 

AnalysisException: Failed to load metadata for table: default.order_items CAUSED BY: TableLoadingException: Problem reading Avro schema at: hdfs://216.121.84.2/user/examples/sqoop_import_order_items.avsc CAUSED BY: ConnectException: Call From g2316-cldramaster-01/10.103.62.2 to 216.121.84.2:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused CAUSED BY: ConnectException: Connection refused

 

I saw a similar post, but the solution didn't seem to apply to my situation. This occurs in Exercise 2.

 

Any ideas?

 

Thanks,

 

 

Tom


7 REPLIES

Guru

Can you confirm in Cloudera Manager that the HDFS service is running and healthy? If the service is marked in any color other than green, there should be a little warning icon that you can click on to get any information about what may be wrong.

 

If the service is healthy, can you tell me what happens when you run "hadoop fs -ls /user/examples/sqoop_import_order_items.avsc" from the command line on a machine in your cluster?

Explorer

Hi Sean,

 

Thanks for the help. HDFS is 'green' in Cloudera Manager. When I run the hadoop fs command you requested, I get the following:

 

-rw-r--r--   2 root supergroup        980 2015-03-22 20:57 /user/examples/sqoop_import_order_items.avsc

 

 

I'm still looking as well. I'd much rather figure this out on my own, but I'm really struggling with this one.

 

 

Tom

 

 

 

New Contributor

I have the same issue when running the query in the Hive shell or the Hue Impala editor. Please advise.

 

Your query has the following error(s):

AnalysisException: Failed to load metadata for table: default.order_items CAUSED BY: TableLoadingException: Problem reading Avro schema at: hdfs://216.121.78.162/user/examples/sqoop_import_order_items.avsc CAUSED BY: ConnectException: Call From i8936-cldramaster-01/10.102.231.2 to 216.121.78.162:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused CAUSED BY: ConnectException: Connection refused

 

HDFS is working OK.

 

[root@i8936-cldramaster-01 ~]# hadoop fs -ls /user/examples/sqoop_import_order_items.avsc
-rw-r--r--   2 root supergroup        980 2015-03-25 07:08 /user/examples/sqoop_import_order_items.avsc

Explorer

Looks just like what I'm getting.

New Contributor

The problem is the mix-up between public and private IP addresses: use the private address, not the public one, in

TBLPROPERTIES ('avro.schema.url'='hdfs://10.10.10.10 ....
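
In other words, when the tutorial's CREATE EXTERNAL TABLE statements are run, the avro.schema.url property should point at the cluster's private address. As a rough sketch for one table (replace 10.103.62.2, taken from the error message above, with your own private address; the LOCATION path and exact DDL should match whatever your tutorial script uses):

-- Re-create the table so avro.schema.url points at the private IP of the master host
CREATE EXTERNAL TABLE order_items
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
LOCATION 'hdfs:///user/hive/warehouse/order_items'
TBLPROPERTIES ('avro.schema.url'='hdfs://10.103.62.2/user/examples/sqoop_import_order_items.avsc');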

New Contributor

Thanks - got that!

 

For anyone who has a similar issue, please do the following:

 

DROP TABLE IF EXISTS categories;
DROP TABLE IF EXISTS customers;
DROP TABLE IF EXISTS departments;
DROP TABLE IF EXISTS orders;
DROP TABLE IF EXISTS order_items;
DROP TABLE IF EXISTS products;

 

...then exit your Hive shell and run 'hostname -i'.

 

Then replace the IP address in the tutorial's CREATE TABLE statements with the one returned by that command.

 

Should be fine.
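
One more note for anyone querying from Impala (as in the original post): after the tables are dropped and re-created in Hive, Impala's cached catalog may still be stale, so reload it before re-running the SELECTs, e.g. in impala-shell or the Hue Impala editor:

-- Force Impala to reload table metadata from the Hive metastore
INVALIDATE METADATA;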

Explorer

Wow. That did it. Thanks for the help.