Member since
07-19-2016
05-22-2017
03:36 PM
Do you use the zookeeper user to write data to HDFS? Does the zookeeper user have all the required permissions for that?
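For example, you can check the ownership and permission bits of the target directory like this (the path below is a placeholder, adjust it to your cluster):

```shell
# Show owner, group and permission bits of the target HDFS directory
# (replace /path/to/target with your actual path):
hdfs dfs -ls -d /path/to/target

# If the zookeeper user lacks write access, one option is to change
# ownership or widen the permissions (adapt to your security policy):
# hdfs dfs -chown zookeeper:hadoop /path/to/target
# hdfs dfs -chmod 775 /path/to/target
```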
05-22-2017
12:45 PM
I think your error is related to a Kerberos issue. Is your ticket valid for the entire duration of the copyFromLocal operation?
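You can check the ticket lifetime with klist; the keytab path and principal below are placeholders for your own:

```shell
# Show when the current Kerberos ticket expires; if copyFromLocal runs
# past the "Expires" time, the write can fail mid-transfer:
klist

# Renew from a keytab before starting a long-running copy (placeholders):
# kinit -kt /etc/security/keytabs/myuser.keytab myuser@EXAMPLE.COM
```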
07-20-2016
11:47 AM
Thanks @SBandaru. Just to share my experience: the problem was on the Flume side. The Flume agents hit OutOfMemoryError (unable to create new native thread), and the impact on HDFS was the error posted above. So I think the two errors are related, but I agree with @SBandaru that we can ignore the error message in the datanode logs. Ciao, Ettore
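For anyone hitting the same thing: "unable to create new native thread" is usually an OS process/thread limit for the user running the agent rather than a heap problem, so checking that limit is a quick first step (the agent PID below is a placeholder):

```shell
# Max processes/threads the current user may create; a low value here
# (e.g. 1024) commonly triggers "unable to create new native thread":
ulimit -u

# Count live threads in the Flume agent JVM (replace <pid>):
# ps -o nlwp= -p <pid>
```

Raising the limit for the flume user in /etc/security/limits.conf is one common remedy.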
07-19-2016
08:34 PM
We are using HDP-2.3.4.0. When we ingest data into HDFS using 10 Flume agents, all the datanodes start to log the following error messages after 10-15 minutes:

2016-07-19 18:19:31,144 INFO datanode.DataNode (DataXceiver.java:writeBlock(655)) - Receiving BP-1264119021-192.168.2.1-1454492758635:blk_1074098375_358091 src: /192.168.2.8:36648 dest: /192.168.2.16:1019
2016-07-19 18:19:31,211 INFO datanode.DataNode (DataXceiver.java:writeBlock(655)) - Receiving BP-1264119021-192.168.2.1-1454492758635:blk_1074098376_358092 src: /192.168.2.8:36649 dest: /192.168.2.16:1019
2016-07-19 18:19:31,298 INFO datanode.DataNode (DataXceiver.java:writeBlock(655)) - Receiving BP-1264119021-192.168.2.1-1454492758635:blk_1074098377_358093 src: /192.168.2.8:36650 dest: /192.168.2.16:1019
2016-07-19 18:19:31,553 INFO datanode.DataNode (DataXceiver.java:writeBlock(655)) - Receiving BP-1264119021-192.168.2.1-1454492758635:blk_1074098378_358094 src: /192.168.2.8:36651 dest: /192.168.2.16:1019
2016-07-19 18:19:31,597 INFO datanode.DataNode (DataXceiver.java:writeBlock(655)) - Receiving BP-1264119021-192.168.2.1-1454492758635:blk_1074098379_358095 src: /192.168.2.8:36652 dest: /192.168.2.16:1019
2016-07-19 18:19:31,946 INFO datanode.DataNode (DataXceiver.java:writeBlock(655)) - Receiving BP-1264119021-192.168.2.1-1454492758635:blk_1074098381_358097 src: /192.168.2.11:42313 dest: /192.168.2.16:1019
2016-07-19 18:19:33,134 INFO datanode.DataNode (DataXceiver.java:writeBlock(655)) - Receiving BP-1264119021-192.168.2.1-1454492758635:blk_1074098385_358101 src: /192.168.2.6:53766 dest: /192.168.2.16:1019
2016-07-19 18:19:33,153 INFO datanode.DataNode (BlockReceiver.java:receiveBlock(934)) - Exception for BP-1264119021-192.168.2.1-1454492758635:blk_1074098385_358101
java.io.IOException: Premature EOF from inputStream
at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:201)
at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doReadFully(PacketReceiver.java:213)
at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doRead(PacketReceiver.java:134)
at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.receiveNextPacket(PacketReceiver.java:109)
at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:501)
at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:895)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:807)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:137)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:74)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:251)
at java.lang.Thread.run(Thread.java:745)
2016-07-19 18:19:33,154 INFO datanode.DataNode (BlockReceiver.java:run(1369)) - PacketResponder: BP-1264119021-192.168.2.1-1454492758635:blk_1074098385_358101, type=HAS_DOWNSTREAM_IN_PIPELINE: Thread is interrupted.
2016-07-19 18:19:33,154 INFO datanode.DataNode (BlockReceiver.java:run(1405)) - PacketResponder: BP-1264119021-192.168.2.1-1454492758635:blk_1074098385_358101, type=HAS_DOWNSTREAM_IN_PIPELINE terminating
2016-07-19 18:19:33,155 INFO datanode.DataNode (DataXceiver.java:writeBlock(840)) - opWriteBlock BP-1264119021-192.168.2.1-1454492758635:blk_1074098385_358101 received exception java.io.IOException: Premature EOF from inputStream
2016-07-19 18:19:33,155 ERROR datanode.DataNode (DataXceiver.java:run(278)) - socsds018rm001.sods.local:1019:DataXceiver error processing WRITE_BLOCK operation src: /192.168.2.6:53766 dst: /192.168.2.16:1019
java.io.IOException: Premature EOF from inputStream
at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:201)
at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doReadFully(PacketReceiver.java:213)
at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doRead(PacketReceiver.java:134)
at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.receiveNextPacket(PacketReceiver.java:109)
at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:501)
at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:895)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:807)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:137)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:74)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:251)
at java.lang.Thread.run(Thread.java:745)
2016-07-19 18:19:33,472 INFO datanode.DataNode (DataXceiver.java:writeBlock(655)) - Receiving BP-1264119021-192.168.2.1-1454492758635:blk_1074098386_358102 src: /192.168.2.5:53758 dest: /192.168.2.16:1019
2016-07-19 18:19:33,489 INFO datanode.DataNode (BlockReceiver.java:receiveBlock(934)) - Exception for BP-1264119021-192.168.2.1-1454492758635:blk_1074098386_358102
java.io.IOException: Premature EOF from inputStream
at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:201)
at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doReadFully(PacketReceiver.java:213)
at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doRead(PacketReceiver.java:134)
at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.receiveNextPacket(PacketReceiver.java:109)
at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:501)
at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:895)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:807)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:137)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:74)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:251)
at java.lang.Thread.run(Thread.java:745)
2016-07-19 18:19:33,490 INFO datanode.DataNode (BlockReceiver.java:run(1369)) - PacketResponder: BP-1264119021-192.168.2.1-1454492758635:blk_1074098386_358102, type=HAS_DOWNSTREAM_IN_PIPELINE: Thread is interrupted.
2016-07-19 18:19:33,490 INFO datanode.DataNode (BlockReceiver.java:run(1405)) - PacketResponder: BP-1264119021-192.168.2.1-1454492758635:blk_1074098386_358102, type=HAS_DOWNSTREAM_IN_PIPELINE terminating
There are no obvious errors in the namenode logs.
Labels:
- Apache Flume
- Apache Hadoop