
Not ready to serve the block pool


I am getting this error on one of my worker nodes that runs HDFS and YARN roles.

In Cloudera Manager, I see unexpected exits caused by OutOfMemoryError. In the configuration for this node, I do not see any overcommitted-memory warning. Am I missing something? How do I fix this?

ip-172-31-10-74.ap-south-1.compute.internal:50010:DataXceiver error processing WRITE_BLOCK operation  src: /172.31.5.201:49024 dst: /172.31.10.74:50010
java.io.IOException: Not ready to serve the block pool, BP-1423177047-172.31.4.192-1492091038346.
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.checkAndWaitForBP(DataXceiver.java:1290)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.checkAccess(DataXceiver.java:1298)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:630)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:169)
	at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:106)
	at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:246)
	at java.lang.Thread.run(Thread.java:745)
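One way to confirm that the DataNode exits are heap-related is to count OutOfMemoryError entries in its logs before touching any configuration. This is a minimal sketch: the log path shown in the comment assumes the default CDH location (`/var/log/hadoop-hdfs/`), which may differ on your cluster; the snippet below demonstrates the grep against a sample log line rather than a live log file.

```shell
# On the worker node, a check like this would count heap-exhaustion events
# (path is an assumption based on default CDH log locations):
#   grep -c 'java.lang.OutOfMemoryError' /var/log/hadoop-hdfs/hadoop-*-datanode-*.log
#
# Demonstrated against a hypothetical sample log line:
sample='2017-04-13 10:00:00,000 ERROR ... java.lang.OutOfMemoryError: Java heap space'
echo "$sample" | grep -c 'OutOfMemoryError'
```

If the count is nonzero and rises with each unexpected exit, increasing the DataNode's Java heap (the "Java Heap Size of DataNode in Bytes" setting in Cloudera Manager) is the usual next step.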

Re: Not ready to serve the block pool


Has this problem been solved?
