Member since: 09-27-2016
Posts: 22
Kudos Received: 11
Solutions: 1
My Accepted Solutions
Title | Views | Posted
---|---|---
| 21544 | 10-18-2016 10:01 AM
06-02-2018 09:55 PM
Can you please share what exact known issues you are referring to?
06-29-2017 02:36 PM
Hi @Riccardo Iacomini, I found these two examples quite helpful:

- https://github.com/geniuszhe/ambari-mongodb-cluster
- https://github.com/abajwa-hw/ntpd-stack

I suppose you have your own services; if so, I'd create my own repo (e.g. an RPM repo for CentOS, as in the MongoDB example) where your services are packaged and ready to install.
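A rough sketch of the CentOS/RPM route described above, assuming a web server on the repo host; the package name, paths, repo id, and `repo-host` URL are all placeholders, not anything from the linked examples:

```shell
# Package your custom service as an RPM, then publish it as a yum repo
# that every cluster node can reach.
mkdir -p /var/www/html/myservice-repo
cp myservice-1.0-1.el7.x86_64.rpm /var/www/html/myservice-repo/  # hypothetical RPM
createrepo /var/www/html/myservice-repo                          # generates repodata/

# On each node, register the repo (hypothetical id and URL):
cat > /etc/yum.repos.d/myservice.repo <<'EOF'
[myservice]
name=Custom Ambari service packages
baseurl=http://repo-host/myservice-repo
enabled=1
gpgcheck=0
EOF
```

With the repo in place, the service's Ambari install scripts can simply `yum install` the package by name.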
04-05-2017 01:36 PM
The Mover will move blocks within the same node when possible and thus try to avoid network activity. If that is not possible (e.g. when a node doesn't have SSD or when the local SSDs are full), it will move block replicas across the network to another node that has the target media. I've edited my answer.
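A minimal sketch of driving that behavior from the HDFS CLI; `/data/hot` is a hypothetical path, while `ONE_SSD` is one of HDFS's built-in storage policies:

```shell
# Pin a path to a policy that places one replica on SSD.
hdfs storagepolicies -setStoragePolicy -path /data/hot -policy ONE_SSD

# The Mover scans the given path and relocates any replicas that violate
# the policy, preferring a same-node move between media and falling back
# to a cross-network copy only when the local target media is unavailable.
hdfs mover -p /data/hot
```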
03-28-2017 10:51 AM
Hi @Matt Burgess, thank you for the reply. Sorry for the late answer; I clearly missed your post. In case someone else runs into this, I'd like to link a mirror question on Stack Overflow that may be useful.
10-11-2016 02:59 PM
Thanks for sharing your knowledge, I will try your tips. This specific GC issue happens only when I assign multiple threads to the processors to speed up the flow, which otherwise runs at roughly 10 MB/s single-threaded.
I originally designed the flow around flowfile attributes because I wanted the computation to happen in memory. I thought it would be faster than reading the flowfile content in each processor and parsing it to extract specific fields. Do you suggest implementing a version that works, let's say, "on disk" against flowfile content instead of attributes?
08-02-2019 02:25 PM
@Riccardo Iacomini Thank you for the great post! This is very helpful. I am wondering how you batch things together, e.g. having many CSV rows in one flowfile instead of a single row. To batch CSV rows into one flowfile we would normally use the MergeContent processor, but you also mention that MergeContent is costly. So how should batch processing work in NiFi?