Timeout error in connection while fetching a file from HDFS

Expert Contributor

We are trying to fetch a file from HDFS (hdfs://10.140.176.148:9000/). We are able to list the files on HDFS, however while fetching the file we get the error below:

hdfs.DFSClient: Failed to connect to /192.168.20.8:50010 for block, add to deadNodes and continue. org.apache.hadoop.net.ConnectTimeoutException: 60000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=/192.168.20.8:50010]

It seems the IP to which the request is internally redirected is not reachable from our client servers.

Client servers:
10.141.80.117
10.141.80.119
10.141.80.128
10.141.80.129
10.143.168.75

Could you please help us to resolve the issue?
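
For reference, with the standard hdfs CLI the equivalent commands would look something like the following (the path is just a placeholder):

# Listing works fine
# hdfs dfs -ls hdfs://10.140.176.148:9000/

# Fetching a file times out with the error above (placeholder path)
# hdfs dfs -get hdfs://10.140.176.148:9000/<path-to-file> /tmp/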

3 Replies

Expert Contributor

@Jay Kumar SenSharma can you please help me with this?

Master Mentor

@hardik desai

Is there any specific reason you are using IP addresses instead of hostnames? Using hostnames instead of IP addresses is always recommended.
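
As a rough check, you can confirm what the client is actually configured to use and retry the listing against the NameNode hostname instead of the IP (the hostname below is just a placeholder):

# Show the configured NameNode address
# hdfs getconf -confKey fs.defaultFS

# Retry the listing using the hostname instead of the IP (placeholder hostname)
# hdfs dfs -ls hdfs://namenode-host.example.com:9000/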


Also, the error shows:

Failed to connect to /192.168.20.8:50010


So, from the machine where you are running the HDFS fetch commands, are you able to reach the mentioned DataNode host and port?

# telnet  192.168.20.8  50010 
(OR)
# nc -v  192.168.20.8  50010 
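
It can also help to confirm whether the client machine has a route to the 192.168.20.x network at all, for example:

# Show which route (if any) the client would use to reach the DataNode
# ip route get 192.168.20.8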


It might be a port blocking issue (for example, a firewall), or the DataNode on the mentioned IP address/port might not be listening.

So please SSH to the DataNode host "192.168.20.8" and check whether it is listening on port 50010:

# netstat -tnlpa | grep -i DataNode
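
Note that the DataNode process usually shows up as "java" in the netstat output, so if the grep above does not match anything, grep for the port instead:

# netstat -tnlpa | grep 50010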


If you see that the port is not open, or that it is not bound to the intended network interface (IP address), then please fix it, or share the DataNode logs so we can see if there are any errors.
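
Also, since your client machines (10.14x.x.x) are on a different network than the DataNode's internal address (192.168.20.8), one quick test, assuming the DataNode hostnames are resolvable and routable from the client side, is to tell the HDFS client to connect to DataNodes by hostname rather than by the IP returned by the NameNode (the path below is just a placeholder):

# Quick test: make the client use DataNode hostnames instead of the reported IPs
# hdfs dfs -D dfs.client.use.datanode.hostname=true -get hdfs://10.140.176.148:9000/<path-to-file> /tmp/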

Expert Contributor

@Jay Kumar SenSharma sure, I will check and update. Thanks!!