Member since: 05-13-2022
Posts: 8
Kudos Received: 0
Solutions: 0
06-29-2022
03:51 AM
@MattWho - so we have the file coming into the system via a load balancer which, due to other intricacies, is configured to only ingest on one data node. We don't have the option of an SFTP server, so I have to figure this out on the canvas.
06-29-2022
03:15 AM
So I have a single file coming into a single data node (one of 7 in a cluster). I need to fetch this file into NiFi and process it through the flow so that a copy ends up on every available data node. I update the attributes to change the file's location on the destination and use a PutFile to place it there. I have since included a DuplicateFlowFile processor to copy the file 6 times (7 in total, including the original), but with round robin on the connection this isn't distributing it across the data nodes correctly. I am now looking at adding a DistributeLoad processor after the DuplicateFlowFile and configuring it to direct to 7 individual PutFile processors. However, I'm unsure how to configure these to place the file on a specific data node. Can I include the hostname in the Directory field?
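For reference, a minimal sketch of the shape I'm aiming for, assuming NiFi 1.8+ (load-balanced connections) and a 7-node cluster; the directory path is just a placeholder:

DuplicateFlowFile
    Number of Copies = 6                   # original + 6 copies = 7 flowfiles

Connection (DuplicateFlowFile success -> PutFile)
    Load Balance Strategy = Round robin    # spreads the 7 flowfiles across the 7 nodes

PutFile (scheduled on all nodes, writes to that node's local disk)
    Directory = /data/incoming             # placeholder path
    Conflict Resolution Strategy = replace

I'm assuming ${hostname()} in Expression Language evaluates on whichever node the processor happens to run, so putting it in the Directory property would not push a file to a different node; the node targeting would have to come from the connection itself.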
06-13-2022
02:12 AM
Current setup is as below. To reiterate the concept: we receive a single file uploaded to a single data node (1 of 7) and want to process it through NiFi so that an individual copy of that file is presented on every data node. Currently I am getting multiple copies created by the DuplicateFlowFile processor, but they are not being placed on the individual nodes.
06-10-2022
03:58 AM
I've put in the DuplicateFlowFile processor as shown in the screenshot above. However, it is still only sending to a single data node. Is there a step or configuration I'm missing?
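In case it helps narrow it down, this is how I believe the connection between DuplicateFlowFile and PutFile would need to be configured (assuming NiFi 1.8 or later, where connections have load-balancing settings in the connection configuration):

Connection: DuplicateFlowFile (success) -> PutFile
    Load Balance Strategy    = Round robin
    Load Balance Compression = Do not compress   # default

Without a load-balance strategy on the connection, all the copies would stay on the node they were created on, which would match what I'm seeing.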
06-10-2022
03:28 AM
The team logs into the system but cannot guarantee which data node in the cluster they connect to. Currently the flow places the file (with the updated attributes: user name and file location) on the same data node it arrived on. What we want to do is take the original file, duplicate it, and place a copy on every node so that the end user can access the file when they log in, no matter which node they end up on. I have managed to edit the flow using the DuplicateFlowFile mentioned below, but I haven't managed to get the PutFile processor to direct one copy to each data node.
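As a sanity check (a rough, untested sketch), I'm considering an UpdateAttribute right after the load-balanced connection so each copy records the node it landed on; landed.on.node is just a name I made up:

UpdateAttribute (dynamic property)
    landed.on.node = ${hostname(true)}    # fully-qualified hostname of the node holding the flowfile

As far as I can tell, PutFile always writes to the local filesystem of whichever node holds the flowfile, so the per-node placement has to come from the connection rather than from PutFile itself.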
06-06-2022
09:16 AM
The file is placed by a third party onto an SFTP location on one node, but we want it to be available on all nodes so that the receiving team can access it. We'll then run a daily job to remove it before the next file is deposited.
06-06-2022
07:53 AM
I have an issue whereby I receive a file into NiFi on a single node but need to copy it onto multiple nodes. Has anyone done this before, and what would the process be?
Labels:
- Apache NiFi
- HDFS