
Not ready to serve the block pool

I am getting this error on one of my worker nodes that runs HDFS and YARN.

In the manager, I see unexpected exits due to OutOfMemoryError. In the configuration for this node, I do not see an overcommitted-memory warning. Am I missing something? How do I fix this?

ip-172-31-10-74.ap-south-1.compute.internal:50010:DataXceiver error processing WRITE_BLOCK operation  src: /172.31.5.201:49024 dst: /172.31.10.74:50010
java.io.IOException: Not ready to serve the block pool, BP-1423177047-172.31.4.192-1492091038346.
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.checkAndWaitForBP(DataXceiver.java:1290)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.checkAccess(DataXceiver.java:1298)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:630)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:169)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:106)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:246)
	at java.lang.Thread.run(Thread.java:745)
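In case it helps, a minimal sketch of how the OOM exits could be correlated with the "Not ready to serve the block pool" errors by searching the DataNode log (the log path and line formats below are assumptions for illustration; substitute your actual DataNode log file):

```shell
# Hypothetical sample log standing in for a real DataNode log
# (replace /tmp/datanode_sample.log with your actual log path).
printf '%s\n' \
  'ERROR datanode.DataNode: java.lang.OutOfMemoryError: Java heap space' \
  'INFO  datanode.DataNode: Receiving BP-1423177047-172.31.4.192-1492091038346' \
  'ERROR DataXceiver: Not ready to serve the block pool, BP-1423177047-172.31.4.192-1492091038346.' \
  > /tmp/datanode_sample.log

# Count each kind of event to see whether they occur together.
grep -c 'OutOfMemoryError' /tmp/datanode_sample.log                  # prints 1
grep -c 'Not ready to serve the block pool' /tmp/datanode_sample.log # prints 1
```

If OutOfMemoryError entries appear shortly before the block-pool errors, that would suggest the DataNode is dying from heap exhaustion rather than a block-pool registration problem on its own.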
1 Reply

Re: Not ready to serve the block pool

New Contributor

Has this problem been solved?