New Contributor
Posts: 1
Registered: ‎02-09-2015

Failed to start datanode due to bind exception

I have been repeatedly trying to start the datanode, but it fails with a bind exception saying the address is already in use, even though the port is free.


I used the command below to check:

netstat -a -t --numeric-ports -p | grep 500
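One caveat with that check: `netstat -t` lists TCP sockets only, while the error below is about a Unix domain socket path. A quick sketch of what I would check instead (socket path taken from the error message):

```shell
SOCK=/var/run/hdfs-sockets/datanode   # path from the bind error below

# netstat -t covers TCP only; Unix domain sockets need -x / --unix:
ss -x 2>/dev/null | grep hdfs-sockets || echo "no live Unix socket under hdfs-sockets"

# A socket *file* left behind by a crashed datanode triggers the same
# "Address already in use" error even when nothing is listening:
if [ -e "$SOCK" ]; then
  echo "socket file present: $SOCK"
else
  echo "no socket file at $SOCK"
fi
```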


I have overridden the default port 50070 with 50081, but the issue still persists.


Starting DataNode with maxLockedMemory = 0
Opened streaming server at /
Balancing bandwith is 10485760 bytes/s
Number threads for balancing is 5
Waiting for threadgroup to exit, active threads is 0
Shutdown complete.
Exception in secureMain bind(2) error: Address already in use when trying to bind to '/var/run/hdfs-sockets/datanode'
    at Method)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getDomainPeerServer(
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initDataXceiver(
    at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(
    at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(
    at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(
    at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(
    at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(
    at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(
    at org.apache.hadoop.hdfs.server.datanode.DataNode.main(
Exiting with status 1
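Reading the trace: the bind fails in `getDomainPeerServer`, i.e. on the domain socket configured by `dfs.domain.socket.path` (used for short-circuit reads), not on any TCP port, which is why changing the web port made no difference. A hedged sketch of the usual remedy, assuming no datanode process is actually still running on the host:

```shell
SOCK=/var/run/hdfs-sockets/datanode   # value of dfs.domain.socket.path

# 1. Confirm no old datanode process still holds the socket
#    (e.g. check `ps -ef | grep datanode` before removing anything).

# 2. If nothing is running, the socket file is stale; removing it lets
#    the datanode bind again on the next start:
if [ -S "$SOCK" ]; then
  echo "stale socket found; remove with: sudo rm $SOCK"
else
  echo "no socket file at $SOCK"
fi
```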


New Contributor
Posts: 3
Registered: ‎06-13-2016

Re: Failed to start datanode due to bind exception


Restart the Cloudera agent on that datanode host and then try to restart the datanode. The datanode should come up.
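For reference, the agent restart looks like this on a package-based install (service name from standard Cloudera Manager packaging; adjust if your install differs). The sketch echoes the command rather than running it, since it needs root on the affected host:

```shell
# Agent service name under standard Cloudera Manager packaging:
AGENT=cloudera-scm-agent

echo "run as root on the datanode host: service $AGENT restart"
# Then restart the DataNode role from the Cloudera Manager UI.
```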