Member since: 07-30-2019
Posts: 155
Kudos Received: 107
Solutions: 33
My Accepted Solutions
Views | Posted
---|---
7018 | 04-18-2019 08:27 PM
2356 | 12-31-2018 07:36 PM
4120 | 12-03-2018 06:47 PM
1348 | 06-02-2018 02:35 AM
3431 | 06-02-2018 01:30 AM
10-27-2016
12:22 AM
3 Kudos
Mark,
The certificate you purchased from a certificate authority will identify the NiFi application. Depending on the format it is in (likely a *.key file containing the private key, which never left your computer, and a *.pem or *.der file containing the corresponding public certificate, signed by the CA in response to a CSR (Certificate Signing Request)), you will need to build the following files:
Keystore
This will contain the private key and its signed public certificate, chained with the issuing CA's public certificate (stored as a privateKeyEntry) [see example output below].
Truststore
This will contain the public certificate of your client certificate (if using one) in order to authenticate you as a user connecting to the UI/API.
Alternate example using keytool:
You generate a public/private keypair using the Java keytool:
$ keytool -genkey -alias nifi -keyalg RSA -keysize 2048 -keystore keystore.jks
You then generate a certificate signing request to send to the certificate authority:
$ keytool -certreq -alias nifi -file nifi.csr -keystore keystore.jks
You will get a CSR file, nifi.csr, which you send to the CA, and they provide a signed public certificate (along with the public certificate of the CA) back, e.g. cert_from_ca.pem:
$ keytool -import -trustcacerts -alias nifi -file cert_from_ca.pem -keystore keystore.jks
Here is a link to the full steps I ran (I ran my own CA in another terminal to simulate the actions of the external CA) and the resulting output.
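As a sanity check (assuming the keystore.jks file and nifi alias from the steps above), you can list the entry and confirm it shows a PrivateKeyEntry with the full certificate chain:

```shell
# Inspect the keystore; the entry should appear as a PrivateKeyEntry
# with a certificate chain length of 2 or more (your cert plus the CA's).
keytool -list -v -keystore keystore.jks -alias nifi
```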
10-25-2016
06:17 PM
2 Kudos
As NiFi uses Jetty internally for its web server capabilities, you could try using a HeaderPatternRule as described here to enable HSTS (HTTP Strict Transport Security), which forces HTTPS-only connections. Browsers respond to the provided Strict-Transport-Security header and know to attempt an HTTPS connection.
This isn't directly supported by NiFi, though, so you would have to modify code in the application. There is an existing Apache Jira (NIFI-2437) to enable this through a NiFi configuration setting.
10-14-2016
04:07 PM
Frank, yes, you are correct that replacing the flow definition requires restarting the server. The Variable Registry wiki page is a work in progress, as is the development effort. @Yolanda M. Davis has done significant work on this, and more information is available in the Getting Started Guide and Admin Guide.
10-13-2016
11:06 PM
4 Kudos
Hi Frank,
The development/QA/production environment promotion process (sometimes referred to as "SDLC" or "D2P" in conversation) is a topic of much discussion amongst the HDF development team. Currently, there are plans to improve this process in a future release. For now, I will discuss some common behaviors/workflows that we have seen.
The $NIFI_HOME/conf/flow.xml.gz file contains the entire flow serialized to XML. This file contains all processor configuration values, even sensitive values (encrypted). With the new Variable Registry effort, you can refer to environment-specific variables transparently, and promote the same flow between environments without having to update specific values in the flow itself.
The XML flow definition or specific templates can be committed and versioned using Git (or any other source code control tool). Recent improvements like "deterministic template diffs" have made this versioning easier.
The NiFi REST API can be used to "automate" the deployment of a template or flow to an instance of NiFi.
A script (Groovy, Python, etc.) could be used to integrate with both your source code control tool and your various NiFi instances to semi-automate this process (i.e. tap into Git hooks detecting a commit, and promote automatically to the next environment), but you probably want some human interaction to remain for verification at each stage.
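As a sketch of that REST-driven deployment (the template upload endpoint is from the NiFi 1.x REST API; the host, port, and template filename here are placeholder assumptions, and a secured instance would additionally need authentication):

```shell
# Upload a template XML to the root process group of a NiFi instance.
# "nifi-host:8443" and "my-flow-template.xml" are placeholders.
curl -k -X POST \
  -F template=@my-flow-template.xml \
  "https://nifi-host:8443/nifi-api/process-groups/root/templates/upload"
```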
We understand that the current state of NiFi is not ideal for the promotion of the flow between dev/QA/prod environments. There are ongoing efforts to improve this, but I can't describe anything concrete at this time. If these points raise specific questions or you think of something else, please follow up.
10-10-2016
07:11 PM
1 Kudo
Hi,
I'm assuming that you are using multiple capture groups to extract each piece of information. Can you explain what "it is not working" looks like in your situation? Is it capturing nothing, capturing different values than you expected, or throwing an exception? One possibility is that your expression is not focused enough -- if that is the complete expression, it would capture "133" first (as well as "199" and "040" before getting to "200"). If you know the log format will remain consistent, you might want to try something like HTTP\/\d\.\d" (\d{3}). Please let us know if you have any more information and if this solves your problem.
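As a quick illustration outside NiFi, the same capture can be checked on the command line (sed's extended regexes use [0-9] where Java's use \d):

```shell
# Extract the HTTP status code from a common-log-format line using
# the capture group suggested above ([0-9] in place of \d).
line='133.43.96.45 - - [01/Aug/1995:00:00:16 -0400] "GET /shuttle/missions/sts-69/mission-sts-69.html HTTP/1.0" 200 10566'
echo "$line" | sed -E 's/.*HTTP\/[0-9]\.[0-9]" ([0-9]{3}).*/\1/'   # prints 200
```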
Update: I tested this expression and was able to get the following output:
--------------------------------------------------
Standard FlowFile Attributes
Key: 'entryDate'
Value: 'Mon Oct 10 12:18:27 PDT 2016'
Key: 'lineageStartDate'
Value: 'Mon Oct 10 12:18:27 PDT 2016'
Key: 'fileSize'
Value: '115'
FlowFile Attribute Map Content
Key: 'HTTP response'
Value: '200'
Key: 'HTTP response.0'
Value: 'HTTP/1.0" 200'
Key: 'HTTP response.1'
Value: '200'
Key: 'filename'
Value: '787130965602970'
Key: 'path'
Value: './'
Key: 'uuid'
Value: 'ccb6f333-de33-4037-9a1a-aa9ce7f2ef32'
--------------------------------------------------
133.43.96.45 - - [01/Aug/1995:00:00:16 -0400] "GET /shuttle/missions/sts-69/mission-sts-69.html HTTP/1.0" 200 10566
I uploaded the template I used here: ExtractText Regex Template.
09-28-2016
06:47 PM
2 Kudos
Riccardo, I'm sorry you were having this problem, but I just wanted to say thank you for writing such a complete and detailed question. Providing the list of things you have already tried and specific expectations makes answering it much easier for everyone involved. It definitely cuts down on the mental gymnastics of trying to estimate the experience level and comprehension of a user we haven't communicated with before.
09-26-2016
06:18 PM
I would not recommend using haveged without fully understanding the issue of getting sufficiently unpredictable random input for security purposes. Multiple well-credentialed security experts have weighed in with concerned, if not dismissive, responses.
Michael Kerrisk:
Having read a number of papers about HAVEGE, Peter [Anvin] said he had been unable to work out whether this was a "real thing". Most of the papers that he has read run along the lines, "we took the output from HAVEGE, and ran some tests on it and all of the tests passed". The problem with this sort of reasoning is the point that Peter made earlier: there are no tests for randomness, only for non-randomness.
One of Peter's colleagues replaced the random input source employed by HAVEGE with a constant stream of ones. All of the same tests passed. In other words, all that the test results are guaranteeing is that the HAVEGE developers have built a very good PRNG. It is possible that HAVEGE does generate some amount of randomness, Peter said. But the problem is that the proposed source of randomness is simply too complex to analyze; thus it is not possible to make a definitive statement about whether it is truly producing randomness. (By contrast, the HWRNGs that Peter described earlier have been analyzed to produce a quantum theoretical justification that they are producing true randomness.) "So, while I can't really recommend it, I can't not recommend it either." If you are going to run HAVEGE, Peter strongly recommended running it together with rngd, rather than as a replacement for it.
Tom Leek:
Of course, the whole premise of HAVEGE is questionable. For any practical security, you need a few "real random" bits, no more than 200, which you use as seed in a cryptographically secure PRNG. The PRNG will produce gigabytes of pseudo-[data] indistinguishable from true randomness, and that's good enough for all practical purposes.
Insisting on going back to the hardware for every bit looks like yet another outbreak of that flawed idea which sees entropy as a kind of gasoline, which you burn up when you look at it.
I would recommend directing the JVM to read from /dev/urandom. In response to the concerns above, I'm not sure what "It's not guaranteed to always work" means, but the other issues are mitigated by providing a Java parameter in conf/bootstrap.conf.
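For example, a line like the following in conf/bootstrap.conf (the argument index here is arbitrary; use the next unused one in your file):

```
# conf/bootstrap.conf -- direct the JVM's SecureRandom to /dev/urandom.
# The "./" in the path works around a long-standing JVM quirk that
# otherwise ignores file:/dev/urandom; "java.arg.20" is an example index.
java.arg.20=-Djava.security.egd=file:/dev/./urandom
```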
09-22-2016
07:26 PM
If you prefer not to follow this path, you could connect an EvaluateJsonPath processor to an UpdateAttribute processor and use the Expression Language mathematical operators to calculate these values as well.
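For instance, if EvaluateJsonPath has written two numeric attributes (the names value1 and value2 here are hypothetical), an UpdateAttribute property can compute their sum with the Expression Language math functions:

```
sum = ${value1:toNumber():plus(${value2:toNumber()})}
```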
09-22-2016
07:22 PM
3 Kudos
Hi @Obaid Salikeen, my suggestion would be to use the ExecuteScript processor with your scripting language of choice (Groovy, Python, Ruby, Scala, JavaScript, and Lua are all supported). With Groovy, for example, this would be approximately four lines -- use a JsonSlurper to parse the JSON and extract the value(s) you are interested in, then use any combination of collect, sum, average, etc. to perform the desired mathematical operation, and write the result to the OutputStream.
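Outside of NiFi, the aggregation itself can be sketched in one line (the "value" field name is hypothetical; inside the processor you would do the equivalent with JsonSlurper in Groovy):

```shell
# Sum a numeric field across a JSON array -- illustrative only;
# the "value" field name is a made-up example.
echo '[{"value": 1}, {"value": 2}, {"value": 3}]' \
  | python3 -c 'import json,sys; print(sum(o["value"] for o in json.load(sys.stdin)))'   # prints 6
```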
@Matt Burgess has some good examples of using ExecuteScript with JSON arrays here.
09-20-2016
06:48 PM
That is literally the use case for MiNiFi. MiNiFi requires far fewer resources and has no UI, but can run flows designed using the NiFi UI on the same or a separate machine.