InvalidBlockTokenException while copying data to HDFS

Explorer

Whenever I try to copy data to HDFS, I get the following exception.

Sometimes the data is copied and sometimes it isn't.
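The copy itself is an ordinary put; for example (the destination is inferred from the aborted ._COPYING_ path in the log below, the exact invocation may differ):

# Upload the tarball into HDFS as root; this fails intermittently.
hdfs dfs -put postgres-xl-9.5r1.tar.bz2 /user/root/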

16/05/23 01:40:26 INFO hdfs.DFSClient: Exception in createBlockOutputStream
org.apache.hadoop.hdfs.security.token.block.InvalidBlockTokenException: Got access token error, status message , ack with firstBadLink as 10.200.146.167:50010
    at org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtoUtil.checkBlockOpStatus(DataTransferProtoUtil.java:134)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1393)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1295)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:463)
16/05/23 01:40:26 INFO hdfs.DFSClient: Abandoning BP-1475253775-10.200.146.164-1463754036445:blk_1073742227_1403
16/05/23 01:40:26 INFO hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[10.200.146.167:50010,DS-3d4a4a18-98eb-40b0-acb2-f1e454a67ee7,DISK]
16/05/23 01:40:26 INFO hdfs.DFSClient: Exception in createBlockOutputStream
org.apache.hadoop.hdfs.security.token.block.InvalidBlockTokenException: Got access token error, status message , ack with firstBadLink as 10.200.146.172:50010
    at org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtoUtil.checkBlockOpStatus(DataTransferProtoUtil.java:134)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1393)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1295)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:463)
16/05/23 01:40:26 INFO hdfs.DFSClient: Abandoning BP-1475253775-10.200.146.164-1463754036445:blk_1073742228_1404
16/05/23 01:40:26 INFO hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[10.200.146.172:50010,DS-1b41638c-ddff-4409-9ca0-f8b4ecbb46d6,DISK]
16/05/23 01:40:26 INFO hdfs.DFSClient: Exception in createBlockOutputStream
org.apache.hadoop.hdfs.security.token.block.InvalidBlockTokenException: Got access token error, status message , ack with firstBadLink as 10.200.146.168:50010
    at org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtoUtil.checkBlockOpStatus(DataTransferProtoUtil.java:134)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1393)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1295)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:463)
16/05/23 01:40:26 INFO hdfs.DFSClient: Abandoning BP-1475253775-10.200.146.164-1463754036445:blk_1073742229_1405
16/05/23 01:40:26 INFO hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[10.200.146.168:50010,DS-89f60613-85eb-4ec8-a571-f6dee904bc57,DISK]
16/05/23 01:40:26 INFO hdfs.DFSClient: Exception in createBlockOutputStream
org.apache.hadoop.hdfs.security.token.block.InvalidBlockTokenException: Got access token error, status message , ack with firstBadLink as 10.200.146.166:50010
    at org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtoUtil.checkBlockOpStatus(DataTransferProtoUtil.java:134)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1393)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1295)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:463)
16/05/23 01:40:26 INFO hdfs.DFSClient: Abandoning BP-1475253775-10.200.146.164-1463754036445:blk_1073742230_1406
16/05/23 01:40:26 INFO hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[10.200.146.166:50010,DS-e19737ba-1f63-444a-b22b-1210c75c6ad5,DISK]
16/05/23 01:40:26 WARN hdfs.DFSClient: DataStreamer Exception
java.io.IOException: Unable to create new block.
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1308)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:463)
16/05/23 01:40:26 WARN hdfs.DFSClient: Could not get block locations. Source file "/user/root/postgres-xl-9.5r1.tar.bz2._COPYING_" - Aborting...
put: Got access token error, status message , ack with firstBadLink as 10.200.146.166:50010

The individual datanode logs are as follows:

datanode2:

2016-05-23 00:31:17,766 WARN datanode.DataNode (DataXceiver.java:checkAccess(1311)) - Block token verification failed: op=WRITE_BLOCK, remoteAddress=/10.200.146.173:40315, message=Block token with block_token_identifier (expiryDate=1463939895457, keyId=503794258, userId=root, blockPoolId=BP-1475253775-10.200.146.164-1463754036445, blockId=1073742225, access modes=[WRITE]) is expired.
2016-05-23 00:31:17,766 ERROR datanode.DataNode (DataXceiver.java:run(278)) - HadoopSlave9:50010:DataXceiver error processing WRITE_BLOCK operation src: /10.200.146.173:40315 dst: /10.200.146.173:50010
org.apache.hadoop.security.token.SecretManager$InvalidToken: Block token with block_token_identifier (expiryDate=1463939895457, keyId=503794258, userId=root, blockPoolId=BP-1475253775-10.200.146.164-1463754036445, blockId=1073742225, access modes=[WRITE]) is expired.
    at org.apache.hadoop.hdfs.security.token.block.BlockTokenSecretManager.checkAccess(BlockTokenSecretManager.java:280)
    at org.apache.hadoop.hdfs.security.token.block.BlockTokenSecretManager.checkAccess(BlockTokenSecretManager.java:301)
    at org.apache.hadoop.hdfs.security.token.block.BlockPoolTokenSecretManager.checkAccess(BlockPoolTokenSecretManager.java:97)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.checkAccess(DataXceiver.java:1296)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:629)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:137)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:74)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:251)
    at java.lang.Thread.run(Thread.java:745)
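The token's expiryDate can be decoded for comparison against the datanode's clock (GNU date; note the log timestamps above are in the node's local timezone):

# expiryDate is in epoch milliseconds; drop the last three digits for `date`
date -u -d @1463939895
# => Sun May 22 17:58:15 UTC 2016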

datanode1:

2016-05-22 13:58:42,364 WARN datanode.DataNode (BlockReceiver.java:run(1389)) - IOException in BlockReceiver.run():
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver$PacketResponder.sendAckUpstreamUnprotected(BlockReceiver.java:1531)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver$PacketResponder.sendAckUpstream(BlockReceiver.java:1468)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver$PacketResponder.run(BlockReceiver.java:1381)
    at java.lang.Thread.run(Thread.java:745)
2016-05-22 13:58:42,364 INFO datanode.DataNode (BlockReceiver.java:run(1392)) - PacketResponder: BP-1475253775-10.200.146.164-1463754036445:blk_1073742234_1410, type=HAS_DOWNSTREAM_IN_PIPELINE
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver$PacketResponder.sendAckUpstreamUnprotected(BlockReceiver.java:1531)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver$PacketResponder.sendAckUpstream(BlockReceiver.java:1468)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver$PacketResponder.run(BlockReceiver.java:1381)
    at java.lang.Thread.run(Thread.java:745)
2016-05-22 13:58:42,364 INFO datanode.DataNode (BlockReceiver.java:run(1406)) - PacketResponder: BP-1475253775-10.200.146.164-1463754036445:blk_1073742234_1410, type=HAS_DOWNSTREAM_IN_PIPELINE terminating
2016-05-22 13:58:42,365 INFO datanode.DataNode (DataXceiver.java:writeBlock(838)) - opWriteBlock BP-1475253775-10.200.146.164-1463754036445:blk_1073742234_1410 received exception java.io.IOException: Premature EOF from inputStream
2016-05-22 13:58:42,365 ERROR datanode.DataNode (DataXceiver.java:run(278)) - HadoopMaster:50010:DataXceiver error processing WRITE_BLOCK operation src: /10.200.146.164:55515 dst: /10.200.146.164:50010
java.io.IOException: Premature EOF from inputStream
    at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:201)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doReadFully(PacketReceiver.java:213)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doRead(PacketReceiver.java:134)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.receiveNextPacket(PacketReceiver.java:109)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:502)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:896)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:805)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:137)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:74)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:251)
    at java.lang.Thread.run(Thread.java:745)
1 ACCEPTED SOLUTION

Master Guru
@hari kiran

Since the failure is intermittent, it looks like some of the datanodes are out of sync.

A couple of things to check:

1. Check the /etc/hosts file; if you are not using DNS, it should be identical on all the datanodes and namenodes.

2. Check whether iptables is running on any of the datanodes; a for loop over the hosts lets you check all of them quickly (see the sketch after this list).

3. Check that time is in sync on all the datanodes. The clocks on the NameNode and the DataNodes must agree.
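A loop along these lines covers all three checks in one pass (a rough sketch: HadoopMaster and HadoopSlave9 appear in your logs, HadoopSlave1 is a placeholder; assumes passwordless ssh and a RHEL/CentOS 6-style "service" command):

#!/usr/bin/env bash
# Replace the host list with your actual nodes.
for h in HadoopMaster HadoopSlave1 HadoopSlave9; do
  echo "===== $h ====="
  ssh "$h" 'md5sum /etc/hosts'                  # 1. same checksum everywhere?
  ssh "$h" 'service iptables status | head -1'  # 2. firewall should be stopped
  ssh "$h" 'date'                               # 3. clocks should agree
done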


3 REPLIES


Explorer

Thank you @Kuldeep Kulkarni. As you said, the time on all the nodes in the cluster was out of sync.

Syncing the clocks did the trick.
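In case it helps anyone else, the fix was roughly as follows on each node (a sketch, assuming RHEL/CentOS 6-style services with ntpd installed; "HadoopMaster" as the common NTP source is just an example):

service ntpd stop
ntpdate HadoopMaster   # one-shot clock sync against a common source
service ntpd start
chkconfig ntpd on      # keep ntpd enabled across reboots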

The Hortonworks community is far, far better than Cloudera's.

You guys respond quickly whenever I am stuck on a serious issue.


Does it require a restart after changing the time and the iptables status?