Member since: 02-20-2017
Posts: 16
Kudos Received: 0
Solutions: 0
08-10-2017
12:02 PM
Thank you, that's perfect.
08-10-2017
09:09 AM
Hi, I have a dataflow set up that directs info to an ExecuteStreamCommand processor. I would like to be able to view the output in real time through a terminal or URL, if that's possible, but I don't have much experience with this. Maybe netcat or something? Is this possible? Thank you.
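One way this real-time view could work (a sketch, not from the thread): route the ExecuteStreamCommand output to a PutTCP processor and watch the stream in a terminal with `nc -l <port>`. The Python below just simulates that pair locally, with a background thread playing the netcat listener and the main thread playing PutTCP; the flowfile content and port are made up for illustration.

```python
import socket
import threading

received = []  # lines collected by the "netcat" side

def listener(server_sock):
    """Accept one connection and read everything sent, like `nc -l`."""
    conn, _ = server_sock.accept()
    with conn:
        while True:
            chunk = conn.recv(1024)
            if not chunk:
                break
            received.append(chunk.decode())

# Bind to an ephemeral port on localhost
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

t = threading.Thread(target=listener, args=(server,))
t.start()

# Simulate PutTCP sending one flowfile's content to the listener
with socket.create_connection(("127.0.0.1", port)) as sender:
    sender.sendall(b"line from ExecuteStreamCommand\n")

t.join()
server.close()
print("".join(received), end="")
```

In a real flow the sender side is handled entirely by PutTCP, so only the listener (netcat or similar) is needed on the viewing machine.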
Labels:
- Apache NiFi
07-11-2017
02:21 PM
Hi Matt. Yes, that was the issue. Thank you for your help.
07-11-2017
01:34 PM
Hi @Matt Clarke
Thanks for the reply. Even when I use the GetFile processor, it seems to hang on the SplitText processor; the data queues up before the SplitText processor. There is a header in the CSV file, so Header Line Count is set to 1 in the processor. Is there something else I am missing? The CSV file is 90 MB with 32 columns. Thanks.
07-07-2017
01:11 PM
Hi. I have a CSV file and a libsvm file I would like to read line by line and add into a dataflow. I have connected TailFile to a ControlRate processor and then a SplitText processor. I am not receiving errors, but nothing is moving. Does TailFile work on these file types? Thank you.
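For reference, the line-splitting step itself is simple: with Header Line Count set to 1, SplitText should drop the first line and emit one flowfile per remaining line. A minimal sketch of that behavior (my own illustration, with a made-up two-column sample; the real file has 32 columns):

```python
def split_lines(text, header_line_count=1):
    """Mimic SplitText: skip header lines, emit one 'flowfile' per
    remaining non-empty line."""
    lines = text.splitlines()
    return [line for line in lines[header_line_count:] if line.strip()]

sample = "id,name\n1,alice\n2,bob\n"
print(split_lines(sample))  # ['1,alice', '2,bob']
```

If data queues in front of SplitText without moving, the bottleneck is usually upstream (e.g. TailFile only picks up lines appended after it starts) rather than this splitting logic.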
03-07-2017
08:49 PM
Hi @Matt Clarke. Perfect, thank you
03-06-2017
02:32 PM
ok, thank you
03-06-2017
12:21 PM
Hi @Michal R, sorry, I should have been clearer: I use the MergeContent processor to merge the files. I want to know if there is a way to forward the merged content only if it contains 50 files within a certain time. So if 50 files are merged within 1 minute, forward them; if not, drop them. I need to edit my question, sorry again.
03-06-2017
09:42 AM
Hi, I am trying to create a flow that forwards merged content only if it contains 50 flowfiles within 1 minute. If there are fewer than 50, I want to drop the content. Is this possible without a custom processor? Thank you.
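The rule being asked for can be stated precisely: forward a merged bundle only if at least 50 flowfiles arrived within the last 60 seconds, otherwise drop it. A sketch of that decision (my own illustration, not a confirmed NiFi recipe; in NiFi itself one might combine MergeContent's Minimum Number of Entries and Max Bin Age with a RouteOnAttribute check on the `merge.count` attribute):

```python
import time

def should_forward(arrival_times, min_count=50, window_seconds=60, now=None):
    """Forward the merged bundle only if at least `min_count` flowfiles
    arrived within the last `window_seconds`; otherwise drop it."""
    now = time.time() if now is None else now
    recent = [t for t in arrival_times if now - t <= window_seconds]
    return len(recent) >= min_count

# 50 files arriving over the last 50 seconds -> forward
times = [100.0 + i for i in range(50)]
print(should_forward(times, now=150.0))  # True
```

With only 49 recent arrivals (or arrivals older than the window) the same check returns False, which corresponds to routing the bundle to a terminating relationship.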
Labels:
- Apache NiFi
02-22-2017
10:04 AM
Perfect, thank you. I had something similar, but I wasn't sure if it was splitting the data in some way.