Support Questions

This DataNode is not connected to one or more of its NameNode(s).


No configuration had changed when I started getting the following alerts:

  • NameNode Connectivity: This DataNode is not connected to one or more of its NameNode(s).
  • Web server status: The Cloudera Manager Agent is not able to communicate with this role's web server.

The DataNode is not connected to one or more of its NameNodes. I am also getting the web server status error saying the Cloudera Manager Agent is not getting a response from this role's web server.

This is what the log looks like:
dwh-worker-4.c.abc-1225.internal ERROR September 12, 2018 5:33 PM DataNode
dwh-worker-4.c.abc-1225.internal:50010:DataXceiver error processing WRITE_BLOCK operation src: /172.31.10.74:44280 dst: /172.31.10.74:50010
java.io.IOException: Not ready to serve the block pool, BP-1423177047-172.31.4.192-1492091038346.
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.checkAndWaitForBP(DataXceiver.java:1290)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.checkAccess(DataXceiver.java:1298)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:630)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:169)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:106)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:246)
at java.lang.Thread.run(Thread.java:745)

Also, the DataNodes are randomly exiting with the same error:

dwh-worker-1.c.abc-1225.internal:50010:DataXceiver error processing WRITE_BLOCK operation  src: /172.31.10.74:49848 dst: /172.31.4.147:50010
java.io.IOException: Not ready to serve the block pool, BP-1423177047-172.31.4.192-1492091038346.
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.checkAndWaitForBP(DataXceiver.java:1290)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.checkAccess(DataXceiver.java:1298)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:630)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:169)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:106)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:246)
	at java.lang.Thread.run(Thread.java:745)
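For anyone hitting the same "Not ready to serve the block pool" error, the DataNode's registration state can usually be confirmed from the command line before restarting anything. This is a diagnostic sketch, not from the original post; the log path follows the Cloudera Manager default layout and may differ on your install:

```shell
# Ask the active NameNode which DataNodes it considers live/dead
hdfs dfsadmin -report

# Confirm the DataNode JVM is actually running on the affected host
jps | grep DataNode

# Check the DataNode log for block-pool registration errors
# (default Cloudera Manager log location; adjust for your install)
tail -n 100 /var/log/hadoop-hdfs/*DATANODE*.log*
```

If `dfsadmin -report` lists the host as dead while the process is still up, the DataNode has lost its registration with the NameNode and a restart of the DataNode role from Cloudera Manager is typically the next step.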

New Contributor

Has this problem been solved?


Super Collaborator