<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Invalid Block token exception while copying data to hdfs in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Invalid-Block-token-exception-while-copying-data-to-hdfs/m-p/161180#M29220</link>
    <description>&lt;P&gt;Does it require a restart after changing the time &amp;amp; iptables status?&lt;/P&gt;</description>
    <pubDate>Thu, 25 May 2017 13:07:47 GMT</pubDate>
    <dc:creator>ranjithap7576</dc:creator>
    <dc:date>2017-05-25T13:07:47Z</dc:date>
    <item>
      <title>Invalid Block token exception while copying data to hdfs</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Invalid-Block-token-exception-while-copying-data-to-hdfs/m-p/161177#M29217</link>
      <description>&lt;P&gt;Whenever I try to copy data to HDFS, I get the following exception.&lt;/P&gt;&lt;P&gt;Sometimes the data is copied and sometimes it isn't.&lt;/P&gt;&lt;PRE&gt;16/05/23 01:40:26 INFO hdfs.DFSClient: Exception in createBlockOutputStream
org.apache.hadoop.hdfs.security.token.block.InvalidBlockTokenException: Got access token error, status message , ack with firstBadLink as 10.200.146.167:50010
    at org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtoUtil.checkBlockOpStatus(DataTransferProtoUtil.java:134)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1393)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1295)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:463)
16/05/23 01:40:26 INFO hdfs.DFSClient: Abandoning BP-1475253775-10.200.146.164-1463754036445:blk_1073742227_1403
16/05/23 01:40:26 INFO hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[10.200.146.167:50010,DS-3d4a4a18-98eb-40b0-acb2-f1e454a67ee7,DISK]
16/05/23 01:40:26 INFO hdfs.DFSClient: Exception in createBlockOutputStream
org.apache.hadoop.hdfs.security.token.block.InvalidBlockTokenException: Got access token error, status message , ack with firstBadLink as 10.200.146.172:50010
    at org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtoUtil.checkBlockOpStatus(DataTransferProtoUtil.java:134)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1393)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1295)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:463)
16/05/23 01:40:26 INFO hdfs.DFSClient: Abandoning BP-1475253775-10.200.146.164-1463754036445:blk_1073742228_1404
16/05/23 01:40:26 INFO hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[10.200.146.172:50010,DS-1b41638c-ddff-4409-9ca0-f8b4ecbb46d6,DISK]
16/05/23 01:40:26 INFO hdfs.DFSClient: Exception in createBlockOutputStream
org.apache.hadoop.hdfs.security.token.block.InvalidBlockTokenException: Got access token error, status message , ack with firstBadLink as 10.200.146.168:50010
    at org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtoUtil.checkBlockOpStatus(DataTransferProtoUtil.java:134)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1393)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1295)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:463)
16/05/23 01:40:26 INFO hdfs.DFSClient: Abandoning BP-1475253775-10.200.146.164-1463754036445:blk_1073742229_1405
16/05/23 01:40:26 INFO hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[10.200.146.168:50010,DS-89f60613-85eb-4ec8-a571-f6dee904bc57,DISK]
16/05/23 01:40:26 INFO hdfs.DFSClient: Exception in createBlockOutputStream
org.apache.hadoop.hdfs.security.token.block.InvalidBlockTokenException: Got access token error, status message , ack with firstBadLink as 10.200.146.166:50010
    at org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtoUtil.checkBlockOpStatus(DataTransferProtoUtil.java:134)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1393)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1295)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:463)
16/05/23 01:40:26 INFO hdfs.DFSClient: Abandoning BP-1475253775-10.200.146.164-1463754036445:blk_1073742230_1406
16/05/23 01:40:26 INFO hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[10.200.146.166:50010,DS-e19737ba-1f63-444a-b22b-1210c75c6ad5,DISK]
16/05/23 01:40:26 WARN hdfs.DFSClient: DataStreamer Exception
java.io.IOException: Unable to create new block.
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1308)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:463)
16/05/23 01:40:26 WARN hdfs.DFSClient: Could not get block locations. Source file "/user/root/postgres-xl-9.5r1.tar.bz2._COPYING_" - Aborting...
put: Got access token error, status message , ack with firstBadLink as 10.200.146.166:50010&lt;/PRE&gt;&lt;P&gt;The individual datanode logs are as follows:&lt;/P&gt;&lt;P&gt;datanode2:&lt;/P&gt;&lt;PRE&gt;2016-05-23 00:31:17,766 WARN datanode.DataNode (DataXceiver.java:checkAccess(1311)) - Block token verification failed: op=WRITE_BLOCK, remoteAddress=/10.200.146.173:40315, message=Block token with block_token_identifier (expiryDate=1463939895457, keyId=503794258, userId=root, blockPoolId=BP-1475253775-10.200.146.164-1463754036445, blockId=1073742225, access modes=[WRITE]) is expired.
2016-05-23 00:31:17,766 ERROR datanode.DataNode (DataXceiver.java:run(278)) - HadoopSlave9:50010:DataXceiver error processing WRITE_BLOCK operation src: /10.200.146.173:40315 dst: /10.200.146.173:50010
org.apache.hadoop.security.token.SecretManager$InvalidToken: Block token with block_token_identifier (expiryDate=1463939895457, keyId=503794258, userId=root, blockPoolId=BP-1475253775-10.200.146.164-1463754036445, blockId=1073742225, access modes=[WRITE]) is expired.
    at org.apache.hadoop.hdfs.security.token.block.BlockTokenSecretManager.checkAccess(BlockTokenSecretManager.java:280)
    at org.apache.hadoop.hdfs.security.token.block.BlockTokenSecretManager.checkAccess(BlockTokenSecretManager.java:301)
    at org.apache.hadoop.hdfs.security.token.block.BlockPoolTokenSecretManager.checkAccess(BlockPoolTokenSecretManager.java:97)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.checkAccess(DataXceiver.java:1296)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:629)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:137)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:74)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:251)
    at java.lang.Thread.run(Thread.java:745)&lt;/PRE&gt;&lt;P&gt;datanode1:&lt;/P&gt;&lt;PRE&gt;2016-05-22 13:58:42,364 WARN datanode.DataNode (BlockReceiver.java:run(1389)) - IOException in BlockReceiver.run():
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver$PacketResponder.sendAckUpstreamUnprotected(BlockReceiver.java:1531)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver$PacketResponder.sendAckUpstream(BlockReceiver.java:1468)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver$PacketResponder.run(BlockReceiver.java:1381)
    at java.lang.Thread.run(Thread.java:745)
2016-05-22 13:58:42,364 INFO datanode.DataNode (BlockReceiver.java:run(1392)) - PacketResponder: BP-1475253775-10.200.146.164-1463754036445:blk_1073742234_1410, type=HAS_DOWNSTREAM_IN_PIPELINE
java.nio.channels.ClosedByInterruptException
    at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:478)
    at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
    at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver$PacketResponder.sendAckUpstreamUnprotected(BlockReceiver.java:1531)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver$PacketResponder.sendAckUpstream(BlockReceiver.java:1468)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver$PacketResponder.run(BlockReceiver.java:1381)
    at java.lang.Thread.run(Thread.java:745)
2016-05-22 13:58:42,364 INFO datanode.DataNode (BlockReceiver.java:run(1406)) - PacketResponder: BP-1475253775-10.200.146.164-1463754036445:blk_1073742234_1410, type=HAS_DOWNSTREAM_IN_PIPELINE terminating
2016-05-22 13:58:42,365 INFO datanode.DataNode (DataXceiver.java:writeBlock(838)) - opWriteBlock BP-1475253775-10.200.146.164-1463754036445:blk_1073742234_1410 received exception java.io.IOException: Premature EOF from inputStream
2016-05-22 13:58:42,365 ERROR datanode.DataNode (DataXceiver.java:run(278)) - HadoopMaster:50010:DataXceiver error processing WRITE_BLOCK operation src: /10.200.146.164:55515 dst: /10.200.146.164:50010
java.io.IOException: Premature EOF from inputStream
    at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:201)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doReadFully(PacketReceiver.java:213)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doRead(PacketReceiver.java:134)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.receiveNextPacket(PacketReceiver.java:109)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:502)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:896)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:805)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:137)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:74)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:251)
    at java.lang.Thread.run(Thread.java:745)&lt;/PRE&gt;</description>
      <pubDate>Sun, 22 May 2016 21:20:03 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Invalid-Block-token-exception-while-copying-data-to-hdfs/m-p/161177#M29217</guid>
      <dc:creator>hari1</dc:creator>
      <dc:date>2016-05-22T21:20:03Z</dc:date>
    </item>
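The datanode2 log above includes the rejected token's expiryDate in epoch milliseconds, which makes the clock-skew diagnosis easy to verify. A minimal sketch (Python; the timestamps are taken verbatim from the logs, and treating the datanode's log timestamp as UTC is an assumption on my part) shows the token had already expired hours before the write arrived:

```python
from datetime import datetime, timezone

# expiryDate from the datanode2 log line, in epoch milliseconds
expiry_ms = 1463939895457
expiry = datetime.fromtimestamp(expiry_ms / 1000, tz=timezone.utc)
print(expiry.isoformat())  # 2016-05-22T17:58:15.457000+00:00

# Timestamp of the rejecting log line; whether that node logs in UTC
# or local time is an assumption -- mixed clocks are the symptom here.
rejected_at = datetime(2016, 5, 23, 0, 31, 17, tzinfo=timezone.utc)
print(rejected_at > expiry)  # True: the token expired hours earlier
```

A token that is hours past its expiry the moment it is presented points at the NameNode (which issues tokens) and the DataNodes (which verify them) disagreeing about the current time.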
    <item>
      <title>Re: Invalid Block token exception while copying data to hdfs</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Invalid-Block-token-exception-while-copying-data-to-hdfs/m-p/161178#M29218</link>
      <description>&lt;A rel="user" href="https://community.cloudera.com/users/1348/hari.html" nodeid="1348"&gt;@hari kiran&lt;/A&gt;&lt;P&gt;As this is intermittent, it looks like some of the datanodes are not in sync.&lt;/P&gt;&lt;P&gt;A couple of things to check:&lt;/P&gt;&lt;P&gt;1. Check the /etc/hosts file; if you are not using DNS, it should be identical on all the datanodes and namenodes.&lt;/P&gt;&lt;P&gt;2. Check whether iptables is running on any of the datanodes; a for loop over the hosts makes it quick to check them all.&lt;/P&gt;&lt;P&gt;3. Check that the clocks are in sync on all the datanodes. Time on the NameNode and DataNodes must match.&lt;/P&gt;</description>
      <pubDate>Sun, 22 May 2016 21:28:43 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Invalid-Block-token-exception-while-copying-data-to-hdfs/m-p/161178#M29218</guid>
      <dc:creator>KuldeepK</dc:creator>
      <dc:date>2016-05-22T21:28:43Z</dc:date>
    </item>
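The time-sync check in the answer above can be sketched as follows. The host names echo the logs in this thread, but the clock readings are invented for illustration (in practice you would collect one per node, e.g. over ssh with `date -u +%s`); only the skew arithmetic is the point:

```python
# Hypothetical clock readings (epoch seconds), one per node; the
# values are made up to mimic the ~6.5 h drift implied by the logs.
clocks = {
    "HadoopMaster": 1463959200,
    "HadoopSlave9": 1463935800,  # lagging far behind the others
    "datanode2":    1463959201,
}

# Worst-case pairwise drift across the cluster, in seconds.
skew = max(clocks.values()) - min(clocks.values())
print(skew)  # 23401

# Block tokens have a fixed lifetime, so drift beyond a few seconds
# can make one node treat a freshly issued token as expired.
ok = skew < 30
print(ok)  # False: fix NTP on every node before retrying the copy
```

Running NTP (ntpd or chronyd) on every node keeps this skew near zero and prevents the token-expiry rejections shown in the datanode logs.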
    <item>
      <title>Re: Invalid Block token exception while copying data to hdfs</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Invalid-Block-token-exception-while-copying-data-to-hdfs/m-p/161179#M29219</link>
      <description>&lt;P&gt;Thank you @Kuldeep Kulkarni; as you said, the clocks on all the nodes in the cluster were out of sync.&lt;/P&gt;&lt;P&gt;Fixing them did the trick.&lt;/P&gt;&lt;P&gt;The Hortonworks community is far, far better than Cloudera's.&lt;/P&gt;&lt;P&gt;You guys respond quickly whenever I am stuck on a serious issue.&lt;/P&gt;</description>
      <pubDate>Mon, 23 May 2016 13:08:07 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Invalid-Block-token-exception-while-copying-data-to-hdfs/m-p/161179#M29219</guid>
      <dc:creator>hari1</dc:creator>
      <dc:date>2016-05-23T13:08:07Z</dc:date>
    </item>
    <item>
      <title>Re: Invalid Block token exception while copying data to hdfs</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Invalid-Block-token-exception-while-copying-data-to-hdfs/m-p/161180#M29220</link>
      <description>&lt;P&gt;Does it require a restart after changing the time &amp;amp; iptables status?&lt;/P&gt;</description>
      <pubDate>Thu, 25 May 2017 13:07:47 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Invalid-Block-token-exception-while-copying-data-to-hdfs/m-p/161180#M29220</guid>
      <dc:creator>ranjithap7576</dc:creator>
      <dc:date>2017-05-25T13:07:47Z</dc:date>
    </item>
  </channel>
</rss>

