
Error when starting HDFS - Address already in use when trying to bind to '/var/hdfs-sockets/dn'

Contributor

Hi,

 

I am getting an error after installation and am not able to start the HDFS DataNode.

I always get this error:

 

Exception in secureMain
java.net.BindException: bind(2) error: Address already in use when trying to bind to '/var/hdfs-sockets/dn'
 at org.apache.hadoop.net.unix.DomainSocket.bind0(Native Method)
 at org.apache.hadoop.net.unix.DomainSocket.bindAndListen(DomainSocket.java:191)

 

I checked with netstat and nothing else appears to be busy on port 50010, which the DataNode runs on; the log even shows:

Opened streaming server at /10.0.9.6:50010.
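The check I ran was roughly along these lines (using netstat or ss, whichever is available; the socket path is the one from my dfs.domain.socket.path setting):

```shell
# Is anything listening on the DataNode TCP transfer port 50010?
(netstat -tln 2>/dev/null || ss -tln 2>/dev/null) | grep ':50010' \
  || echo "nothing listening on 50010"

# The failing bind is on a UNIX domain socket, not a TCP port,
# so the socket path itself is worth inspecting too:
ls -ld /var/hdfs-sockets/dn 2>/dev/null \
  || echo "/var/hdfs-sockets/dn does not exist"
```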

 

I tried setting the parameter dfs.domain.socket.path to different paths:

/var/hdfs-sockets/dn

and

/var/hdfs-sockets

 

These folders were created on the NameNode servers, and I also created them on the DataNode server.

 

I tried setting the ownership to the root user and also to the cloudera-scm user.

And the same error is always thrown.

 

Can someone please tell me how to resolve this error? It is thrown every time I try to start HDFS, and I am unable to continue further.

 

Thank you in advance,

Veljko

 

 

1 ACCEPTED SOLUTION

Contributor

This is resolved.

The solution was that the last segment of the path is a socket file that the DataNode creates automatically.

It should not be created as a directory.
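In other words, only the parent directory should exist. A sketch of the fix, demonstrated under /tmp for illustration (on a real cluster the path would be the configured /var/hdfs-sockets, created as root or the HDFS user):

```shell
# dfs.domain.socket.path = /var/hdfs-sockets/dn  (shown under /tmp here)
SOCKET_PATH=/tmp/hdfs-sockets/dn
PARENT_DIR=$(dirname "$SOCKET_PATH")

# Create only the parent directory; the DataNode creates the 'dn'
# socket file itself when it starts.
mkdir -p "$PARENT_DIR"
chmod 755 "$PARENT_DIR"

# If a directory was mistakenly created at the socket path, remove it:
rmdir "$SOCKET_PATH" 2>/dev/null || true

ls -ld "$PARENT_DIR"
```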


3 REPLIES

Expert Contributor

Hi @VeljkoC,

 

Two options:

 

1) Change the port number

2) Kill the process PID (sometimes ghost processes exist)
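For option 2, a rough way to look for a leftover DataNode process (a sketch; the process name and socket path are assumptions based on this thread):

```shell
# Look for a leftover (ghost) DataNode JVM that may still hold the socket.
# The [D] trick stops grep from matching its own command line.
ps -eo pid,cmd | grep '[D]ataNode' || echo "no DataNode process found"

# If one turns up, terminate it gently first, then forcefully:
#   kill <pid>; sleep 5; kill -9 <pid>
# A stale socket file left by a crashed process can also be cleaned up:
#   rm -f /var/hdfs-sockets/dn
```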

 

Regards,

Manu.

Contributor

Hi @manuroman

 

can you please provide more details?

1. for changing the port - were you referring to the DataNode port 50010 or some other port?

Also, I am confused here because a UNIX domain socket is networkless, which should imply that ports are not relevant.

Please tell me which exact port you were referring to.

 

2. for killing the PID, can you please tell me which exact process should be killed in order to start HDFS? Are you referring to some process on the DataNode, or...?

 

Thank you in advance,

Veljko

 
