
Apache NiFi PutParquet Error


Any idea why we get this error while writing files to HDFS using the PutParquet processor?


at oracle.jdbc.driver.T4CTTIoer11.processError(T4CTTIoer11.java:498)
... 28 common frames omitted
2020-06-10 12:20:24,729 INFO [NiFi Web Server-495935] o.a.n.c.s.StandardProcessScheduler Starting PutParquet[id=92ceb89d-0172-1000-35ad-31dcdeb7b51d]
2020-06-10 12:20:24,729 INFO [NiFi Web Server-495935] o.a.n.controller.StandardProcessorNode Starting PutParquet[id=92ceb89d-0172-1000-35ad-31dcdeb7b51d]
2020-06-10 12:20:24,784 INFO [Timer-Driven Process Thread-7] o.a.hadoop.security.UserGroupInformation Login successful for user 13010840 using keytab file /home/aiadmin/aiadmin.keytab
2020-06-10 12:20:24,791 INFO [Timer-Driven Process Thread-7] o.a.n.c.s.TimerDrivenSchedulingAgent Scheduled PutParquet[id=92ceb89d-0172-1000-35ad-31dcdeb7b51d] to run with 1 threads
2020-06-10 12:20:24,885 ERROR [reader] net.schmizz.sshj.transport.TransportImpl Dying because - Broken transport; encountered EOF
net.schmizz.sshj.transport.TransportException: Broken transport; encountered EOF
at net.schmizz.sshj.transport.Reader.run(Reader.java:57)
2020-06-10 12:20:24,938 INFO [Thread-513233] o.a.h.h.p.d.sasl.SaslDataTransferClient SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-06-10 12:20:24,940 INFO [Thread-513233] org.apache.hadoop.hdfs.DataStreamer Exception in createBlockOutputStream blk_2184945431_1111306988
java.io.EOFException: null
at java.io.DataInputStream.readByte(DataInputStream.java:267)
at org.apache.hadoop.io.WritableUtils.readVLong(WritableUtils.java:308)
at org.apache.hadoop.io.WritableUtils.readVInt(WritableUtils.java:329)
at org.apache.hadoop.hdfs.security.token.block.BlockTokenIdentifier.readFieldsLegacy(BlockTokenIdentifier.java:240)
at org.apache.hadoop.hdfs.security.token.block.BlockTokenIdentifier.readFields(BlockTokenIdentifier.java:221)
at org.apache.hadoop.security.token.Token.decodeIdentifier(Token.java:200)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.doSaslHandshake(SaslDataTransferClient.java:530)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.getEncryptedStreams(SaslDataTransferClient.java:342)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.send(SaslDataTransferClient.java:276)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:245)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.socketSend(SaslDataTransferClient.java:203)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.socketSend(SaslDataTransferClient.java:193)
at org.apache.hadoop.hdfs.DataStreamer.createBlockOutputStream(DataStreamer.java:1731)
at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataS
at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:716)

PutParquet[id=92ceb89d-0172-1000-35ad-31dcdeb7b51d] Failed to write due to Could not get block locations. Source file "<<directory>>.27037720888019658" - Aborting...block==null: java.io.IOException: Could not get block locations. Source file "<<directory>>.27037720888019658" - Aborting...block==null
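To narrow down whether the failure is specific to NiFi or a general HDFS/Kerberos problem, one option is to attempt the same write outside NiFi with the same credentials. The sketch below is hypothetical and environment-dependent: the principal (13010840) and keytab path are taken from the log lines above, and it assumes the Hadoop CLI tools are installed and on PATH on the NiFi host.

```shell
# Authenticate with the same keytab and principal NiFi uses (values from the log above).
kinit -kt /home/aiadmin/aiadmin.keytab 13010840

# Report the client-side Hadoop version; NiFi's bundled Hadoop libraries
# should be compatible with the cluster version.
hadoop version

# Try a direct write to rule out NiFi and exercise the same
# DataNode/SASL data-transfer path that is failing.
echo "test" > /tmp/putparquet-test.txt
hdfs dfs -put -f /tmp/putparquet-test.txt /tmp/putparquet-test.txt
hdfs dfs -ls /tmp/putparquet-test.txt
```

If the direct `hdfs dfs -put` fails with a similar EOFException during the SASL handshake, the issue lies in the HDFS/Kerberos setup rather than in the PutParquet processor configuration.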

2 Replies

New Contributor

Did you manage to solve the issue?

New Contributor

Hi,

  Can you give more details on the NiFi version and the Hadoop/HDFS version? Is NiFi installed externally to the HDFS cluster?
