
NiFi PutHDFS Writing Zero Bytes? (appears to be crypto related)


New Contributor

I'm attempting a simple pipeline to read a file from local disk and store it in HDFS (as a simple test case; my real use case is more complex, but the error reproduces with a simple two-processor flow of GetFile followed by PutHDFS).

  • The source file exists and is read successfully.
  • The target folder does not exist in HDFS (Create Directory is set to true on the processor).
  • The destination folder is created successfully, and the destination file is created.
  • The resulting file always has zero bytes of content.
  • No errors or debug messages appear in the NiFi UI, and the resulting flowfile is routed to "success".

Upon investigating the NiFi logs, I see many of the following exceptions being thrown back to back:

2017-01-04 15:10:06,802 INFO [Thread-15] org.apache.hadoop.hdfs.DFSClient Exception in createBlockOutputStream
java.io.IOException: java.security.InvalidKeyException: Illegal key size
at org.apache.hadoop.crypto.JceAesCtrCryptoCodec$JceAesCtrCipher.init(JceAesCtrCryptoCodec.java:116) ~[hadoop-common-2.7.3.jar:na]
at org.apache.hadoop.crypto.CryptoInputStream.updateDecryptor(CryptoInputStream.java:290) ~[hadoop-common-2.7.3.jar:na]
at org.apache.hadoop.crypto.CryptoInputStream.resetStreamOffset(CryptoInputStream.java:303) ~[hadoop-common-2.7.3.jar:na]
at org.apache.hadoop.crypto.CryptoInputStream.<init>(CryptoInputStream.java:128) ~[hadoop-common-2.7.3.jar:na]
at org.apache.hadoop.crypto.CryptoInputStream.<init>(CryptoInputStream.java:109) ~[hadoop-common-2.7.3.jar:na]
at org.apache.hadoop.crypto.CryptoInputStream.<init>(CryptoInputStream.java:133) ~[hadoop-common-2.7.3.jar:na]
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil.createStreamPair(DataTransferSaslUtil.java:345) ~[hadoop-hdfs-2.7.3.jar:na]
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.doSaslHandshake(SaslDataTransferClient.java:490) ~[hadoop-hdfs-2.7.3.jar:na]
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.getEncryptedStreams(SaslDataTransferClient.java:299) ~[hadoop-hdfs-2.7.3.jar:na]
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.send(SaslDataTransferClient.java:242) ~[hadoop-hdfs-2.7.3.jar:na]
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:211) ~[hadoop-hdfs-2.7.3.jar:na]
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.socketSend(SaslDataTransferClient.java:183) ~[hadoop-hdfs-2.7.3.jar:na]
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1314) [hadoop-hdfs-2.7.3.jar:na]
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262) [hadoop-hdfs-2.7.3.jar:na]
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448) [hadoop-hdfs-2.7.3.jar:na]

2017-01-04 15:10:06,802 INFO [Thread-15] org.apache.hadoop.hdfs.DFSClient Abandoning BP-138799602-162.44.214.77-1447180102573:blk_1128807796_55190832
2017-01-04 15:10:06,825 INFO [Thread-15] org.apache.hadoop.hdfs.DFSClient Excluding datanode DatanodeInfoWithStorage[10.45.56.108:1024,DS-b239b145-6da4-478b-b558-6b87c585f4b7,DISK]
2017-01-04 15:10:06,825 WARN [Thread-15] org.apache.hadoop.hdfs.DFSClient DataStreamer Exception
java.io.IOException: Unable to create new block.

Which ultimately leads to:

2017-01-04 15:10:06,825 WARN [Thread-15] org.apache.hadoop.hdfs.DFSClient Could not get block locations. Source file "full_path_ommitted/test.txt" - Aborting...
2017-01-04 15:10:15,323 INFO [NiFi Web Server-19] o.a.n.controller.StandardProcessorNode Stopping processor: class org.apache.nifi.processors.hadoop.PutHDFS

(As a side note, this should be surfaced at least in the UI debug log; right now, with debug turned on for the PutHDFS processor, none of this shows up, and it certainly should NOT route to "success".)

Some other relevant information:

  • This is a Kerberized Hadoop cluster.
  • hdfs-site.xml and core-site.xml are provided to the processor.
  • The Kerberos principal, keytab, and Kerberos config file were specified in the NiFi config before startup.
  • Authentication is in fact working (it creates the folder and the file); the issue appears to be a crypto failure while negotiating the block streamer.
  • I did find references to needing Oracle's JCE Unlimited Strength policy files (the defaults are restricted due to export rules). I'm in Canada, so I downloaded the appropriate unlimited jars, added them to the lib/security folders of both my JRE and JDK, and restarted NiFi (a quick way to verify they took effect is sketched after this list).
  • NiFi is currently running on my workstation for development: Windows 7 x64 with JDK 8 x64.
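
Since the log shows JceAesCtrCryptoCodec dying with "Illegal key size", a minimal check like the one below (the class name is mine, purely illustrative) can confirm whether the unlimited policy actually took effect; compile and run it with the same java binary that launches NiFi:

import javax.crypto.Cipher;

// Minimal sketch: report whether the JCE Unlimited Strength policy is
// active in this runtime. Under the default export policy AES reports a
// 128-bit maximum, which would explain a failed 256-bit negotiation;
// with the unlimited policy it reports Integer.MAX_VALUE.
public class JcePolicyCheck {
    public static void main(String[] args) throws Exception {
        int maxAes = Cipher.getMaxAllowedKeyLength("AES");
        System.out.println("Max allowed AES key length: " + maxAes);
        System.out.println(maxAes >= 256
                ? "Unlimited strength policy appears active."
                : "Restricted policy is still in effect for this JRE.");
    }
}

If this prints 128 under the java that starts NiFi, the jars landed in a different JRE than the one actually in use.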

Any help, assistance, or suggestions would be greatly appreciated. I'm trying to nail down this issue, as it's preventing me from writing any files into HDFS with NiFi.

Thanks!

8 REPLIES

Re: NiFi PutHDFS Writing Zero Bytes? (appears to be crypto related)

Paul,

Is there space available in HDFS to write new files? Have you tried to run the command manually outside of NiFi?

What version of NiFi is being used?

Re: NiFi PutHDFS Writing Zero Bytes? (appears to be crypto related)

New Contributor

Hey, thanks for the prompt comment!

There is definitely space available; I was able to manually upload the same files to the same path in HDFS using

hdfs dfs -put 

from the CLI.

The NiFi version is (from About in the web UI):

11/26/2016 04:39:37 EST
Tagged nifi-1.1.0-RC2
From f61e42c on branch NIFI-3100-rc2

From nifi.apache.org

Re: NiFi PutHDFS Writing Zero Bytes? (appears to be crypto related)

I think this is related to the unlimited crypto libs which I know you said you installed, but is there any chance there are multiple JDKs/JREs, and maybe the "java" that started NiFi still doesn't have them?

I am on a Mac using a JDK; for me, I put them in JDK_HOME/jre/lib/security, and by "them" I'm referring to US_export_policy.jar and local_policy.jar.
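
One quick way to test the multiple-JRE theory is to have the JVM report on itself. A sketch along these lines (the class name is mine; the layout assumes JDK 8, where java.home points at the jre directory) can be run with the exact java command that starts NiFi:

import java.io.File;

// Minimal sketch: print which JRE this JVM resolved to and whether the
// two policy jars are present in its security folder.
public class PolicyJarLocator {
    public static void main(String[] args) {
        File securityDir = new File(System.getProperty("java.home"),
                "lib" + File.separator + "security");
        System.out.println("Security folder: " + securityDir.getAbsolutePath());
        for (String jar : new String[] {"US_export_policy.jar", "local_policy.jar"}) {
            System.out.println(jar + " present: " + new File(securityDir, jar).isFile());
        }
    }
}

If the reported folder is not the one you patched, that JRE is the one NiFi is using and the one that needs the jars.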

Re: NiFi PutHDFS Writing Zero Bytes? (appears to be crypto related)

New Contributor

There are two JREs on the system: the one bundled with the JDK and a standalone JRE. I've put the jars in the lib/security folder of both (and yes, they are the same files from Oracle, placed in the appropriate folders).

Re: NiFi PutHDFS Writing Zero Bytes? (appears to be crypto related)

Super Guru
@Paul Mumby

It appears that you know the root cause already. For the JDK on your cluster as well as the one running NiFi, have you copied the two files US_export_policy.jar and local_policy.jar into the $JAVA_HOME/jre/lib/security folder?

Re: NiFi PutHDFS Writing Zero Bytes? (appears to be crypto related)

New Contributor

I've definitely installed the policy jars on all JREs on the NiFi system, but I have no control over the cluster; it's administered by another team, and I have only user-level access, with no administrative rights or ability to touch the local JREs on the cluster.

I suspect making the change on the cluster won't be possible: about 30 other projects use the cluster actively and are not experiencing the problem. The issue only seems to affect NiFi, which is running on my local workstation and is the only NiFi instance touching the cluster.

Ultimately we have a problem to solve that is well suited to NiFi, so I wanted to do a proof of concept with NiFi to validate it first. Unfortunately, this issue is blocking further progress.

Re: NiFi PutHDFS Writing Zero Bytes? (appears to be crypto related)

New Contributor

Hi @Paul Mumby,

I am facing a similar issue. Could you please let me know if you have resolved it?

Re: NiFi PutHDFS Writing Zero Bytes? (appears to be crypto related)

New Contributor

Hi @Paul Mumby,

I am facing a similar issue. Could you please let me know if you have resolved it?

It worked two weeks ago, but I hit this error when I tested today.

I am also in Canada and have similar settings.

Thanks,

Andy
