Support Questions


Impala in Hue: AnalysisException caused by java.net.ConnectException

New Contributor

Hello,

 

I started the GoGrid cluster tutorial.

First, the MySQL grants did not work for the Sqoop tutorial; the host specification for user 'retail_dba'@'%' did not seem to take effect, so I explicitly added the IP addresses of the agents. Sqoop then worked fine and the data loaded into HDFS.

 

Then I loaded the Avro metadata into Hive, which seemed to work fine.

 

When running the Impala tutorial, the tables show up after the "INVALIDATE METADATA" command.

 

But any query (or anything else) returns exceptions:

 

AnalysisException: Failed to load metadata for table: default.categories

CAUSED BY: TableLoadingException: Problem reading Avro schema at: hdfs://216......... /user/examples/sqoop_import_categories.avsc

CAUSED BY: ConnectException Call From f6129-cldramaster-01/10.115........ to 216..........:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more detail see: ......... [etc etc etc]

 

 

I tried to run the impala-shell, but this resulted in the same error:

 

Caused by: com.cloudera.impala.catalog.TableLoadingException: Problem reading Avro schema at: hdfs://216.121.116.82/user/examples/sqoop_import_categories.avsc CAUSED BY: ConnectException: Call From f6129-cldramaster-01/10.105.115.2 to 216.121.116.82:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused CAUSED BY: ConnectException: Connection refused

 

Please help and push me in the right direction, or suggest a solution to the above problem!

 

(I'm curious whether the DNS services are working correctly.)
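The connection-refused errors above suggest the table's avro.schema.url embeds a NameNode address the cluster cannot actually reach. A minimal sketch for pulling the host[:port] out of such a URL so it can be compared against the cluster's real NameNode (the URL is taken from the impala-shell error above; the helper function name is my own):

```shell
# Extract the host[:port] embedded in an hdfs:// URL so it can be
# compared against the cluster's configured NameNode (fs.defaultFS).
schema_host() {
  local url="${1#hdfs://}"   # drop the scheme
  echo "${url%%/*}"          # keep everything before the first slash
}

# URL taken from the impala-shell error above
schema_host "hdfs://216.121.116.82/user/examples/sqoop_import_categories.avsc"

# On the cluster you would compare the result against:
#   hdfs getconf -confKey fs.defaultFS
# and test reachability of the NameNode port, e.g.:
#   nc -z -w5 216.121.116.82 8020
```

If the host printed here differs from fs.defaultFS, that mismatch alone explains the ConnectException, regardless of DNS.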

 

 

1 ACCEPTED SOLUTION

New Contributor

Hi,

I solved the issue just by executing the following commands in the Hive console:

 

 

DROP TABLE IF EXISTS categories;
DROP TABLE IF EXISTS customers;
DROP TABLE IF EXISTS departments;
DROP TABLE IF EXISTS orders;
DROP TABLE IF EXISTS order_items;
DROP TABLE IF EXISTS products;

CREATE EXTERNAL TABLE categories
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
LOCATION 'hdfs:///user/hive/warehouse/categories'
TBLPROPERTIES ('avro.schema.url'='hdfs://if3f8-cldramaster-01/user/examples/sqoop_import_categories.avsc');

CREATE EXTERNAL TABLE customers
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
LOCATION 'hdfs:///user/hive/warehouse/customers'
TBLPROPERTIES ('avro.schema.url'='hdfs://if3f8-cldramaster-01/user/examples/sqoop_import_customers.avsc');

CREATE EXTERNAL TABLE departments
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
LOCATION 'hdfs:///user/hive/warehouse/departments'
TBLPROPERTIES ('avro.schema.url'='hdfs://if3f8-cldramaster-01/user/examples/sqoop_import_departments.avsc');

CREATE EXTERNAL TABLE orders
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
LOCATION 'hdfs:///user/hive/warehouse/orders'
TBLPROPERTIES ('avro.schema.url'='hdfs://if3f8-cldramaster-01/user/examples/sqoop_import_orders.avsc');

CREATE EXTERNAL TABLE order_items
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
LOCATION 'hdfs:///user/hive/warehouse/order_items'
TBLPROPERTIES ('avro.schema.url'='hdfs://if3f8-cldramaster-01/user/examples/sqoop_import_order_items.avsc');

CREATE EXTERNAL TABLE products
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
LOCATION 'hdfs:///user/hive/warehouse/products'
TBLPROPERTIES ('avro.schema.url'='hdfs://if3f8-cldramaster-01/user/examples/sqoop_import_products.avsc');
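After recreating the tables in Hive, Impala still caches the old definitions, so a metadata refresh is needed before queries succeed. A hedged sketch of the follow-up checks (the property key is the standard one used by the Avro SerDe; the INVALIDATE METADATA statement is run from impala-shell or the Impala editor in Hue):

```sql
-- In Hive: confirm the recreated table now points at a reachable schema URL
SHOW TBLPROPERTIES categories("avro.schema.url");

-- In Impala (impala-shell or Hue): discard the stale cached metadata
INVALIDATE METADATA;
```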

 

Thanks for all the suggestions from the community!


10 REPLIES

New Contributor

Thanks Roberto! And ScottP1971!

 

A few additions to Roberto's post for those struggling like I was:

 

1. On the lines starting with TBLPROPERTIES, replace "if3f8-cldramaster-01" with your own master node's hostname (not the IP address).

2. Start Hive again and run Roberto's code with the above changes.

3. Start Hue again, run "INVALIDATE METADATA", and then re-run the query that initially gave the error.
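Step 1 can be sketched as a shell substitution; the hostnames here are illustrative (use your own cluster's `hostname -f`), and "create_tables.sql" is a placeholder filename for the DDL above:

```shell
# Step 1: substitute your own master node hostname for Roberto's.
# "f6129-cldramaster-01" is illustrative; on the cluster use $(hostname -f).
master="f6129-cldramaster-01"

# One TBLPROPERTIES line from the DDL above, patched with sed:
line="TBLPROPERTIES ('avro.schema.url'='hdfs://if3f8-cldramaster-01/user/examples/sqoop_import_categories.avsc');"
patched="$(printf '%s\n' "$line" | sed "s/if3f8-cldramaster-01/${master}/")"
echo "$patched"

# Applied to a whole DDL file it would look like:
#   sed "s/if3f8-cldramaster-01/${master}/g" create_tables.sql > create_tables_fixed.sql
# Step 2 (Hive):   hive -f create_tables_fixed.sql
# Step 3 (Impala): impala-shell -q "INVALIDATE METADATA;"
```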