Member since: 12-14-2017 · Posts: 9 · Kudos Received: 0 · Solutions: 0
01-09-2018 09:21 PM · 1 Kudo
@Paresh Baldaniya There are already processors designed to make system calls:

ExecuteProcess -- runs an operating system command specified by the user and writes the output of that command to a FlowFile.
ExecuteStreamCommand -- executes an external command on the contents of a flow file and creates a new flow file with the results of the command.
ExecuteScript -- executes a script, given the flow file and a process session.

I would think ExecuteProcess would be the best fit for your use case.
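Conceptually, what ExecuteProcess does is run a command and capture its stdout as new FlowFile content. A minimal Python sketch of that idea (the `echo` command here is just a stand-in for whatever system call you need):

```python
import subprocess

# Run an OS command and capture its stdout, the way ExecuteProcess
# captures command output and writes it into a FlowFile.
result = subprocess.run(
    ["echo", "hello from the OS"],  # placeholder for your real command
    capture_output=True,
    text=True,
    check=True,  # raise if the command exits non-zero
)

# In NiFi this would become the content of the outgoing FlowFile.
flowfile_content = result.stdout
print(flowfile_content.strip())
```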
12-28-2017 06:41 PM · 3 Kudos
@Paresh Baldaniya We cannot change an existing filename in HDFS, but there is an alternate solution in NiFi. Use an UpdateAttribute processor before the PutParquet processor to rewrite the filename attribute, so every flow file reaching PutParquet has the same name.

UpdateAttribute:- add a new property
filename
desired_parquet_filename.prq

PutParquet:- since every flow file now arrives with the same filename, change the Overwrite Files property to True //if the same filename exists in the directory, the processor will replace the existing file with the new one

Flow:-
1.GetFile
2.UpdateAttribute //set the filename by adding a filename property
3.PutParquet //change the Overwrite Files property to True

If the answer helped to resolve your issue, click the Accept button below to accept the answer. That would be a great help to community users looking for a quick solution to these kinds of errors.
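The fixed-filename plus overwrite pattern above can be sketched outside NiFi as well. A minimal Python illustration (the target name matches the filename property above; everything else is just a stand-in for the PutParquet write):

```python
import os
import tempfile

# Same value as the filename property set in UpdateAttribute above.
FIXED_NAME = "desired_parquet_filename.prq"

def put_with_fixed_name(directory, contents):
    """Write contents under one fixed name, like PutParquet with
    Overwrite Files = True: each write replaces the previous file."""
    target = os.path.join(directory, FIXED_NAME)
    with open(target, "w") as f:  # plain open(..., "w") overwrites
        f.write(contents)
    return target

d = tempfile.mkdtemp()
put_with_fixed_name(d, "first batch")
path = put_with_fixed_name(d, "second batch")

# Only the latest contents survive, and the directory still holds one file.
print(open(path).read())
print(len(os.listdir(d)))
```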
06-26-2018 11:51 AM
Hi @Matt Burgess, how can we do this for a CSV file? For example, I have a CSV file (studentid_name_city.csv). I want to split the filename and put the parts into different columns of the Student table (studentid, name, city).
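In NiFi this would typically be done with expression language against the filename attribute, but the splitting logic itself can be sketched in Python (the filename below is the example from the question; the three parts are placeholders for real values):

```python
from pathlib import Path

# Example filename following the <studentid>_<name>_<city>.csv pattern.
fname = "studentid_name_city.csv"

# Path.stem drops the ".csv" extension; splitting on "_" yields the
# three parts that would map to the studentid, name, and city columns.
studentid, name, city = Path(fname).stem.split("_")
print(studentid, name, city)
```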
12-20-2017 08:32 AM
@Benjamin Hopp Thanks for your reply.
12-19-2017 08:47 PM
So if I am understanding this correctly, you want to do the following:

Windows File Share ----> NiFi ----> Hadoop

Off the top of my head I can think of a couple of ways to do it:
1) Set up the Windows directory to share via FTP. This can be done using IIS on the Windows machine, or a third-party FTP server.
2) Install MiNiFi or NiFi on the Windows machine to transmit data to the NiFi cluster using the site-to-site protocol.

Is there a particular reason you don't want to mount the share on the NiFi host?
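For option 2, the MiNiFi agent on the Windows machine is driven by a flow config that tails a local directory and ships files to a NiFi input port over site-to-site. A rough sketch of the shape of such a config is below — the host, directory, and port id are placeholders, and the exact key names vary between MiNiFi versions, so verify them against the config reference for your release:

```yaml
# Illustrative MiNiFi flow config sketch (key names approximate; check your version's docs)
Processors:
  - name: GetFile
    class: org.apache.nifi.processors.standard.GetFile
    scheduling strategy: TIMER_DRIVEN
    scheduling period: 10 sec
    Properties:
      Input Directory: C:\shared\data   # the Windows share directory (placeholder)

Connections:
  - name: GetFile/success/to-nifi
    source name: GetFile
    source relationship names:
      - success
    destination name: to-nifi

Remote Processing Groups:
  - name: NiFi Cluster
    url: https://nifi-host:8443/nifi     # placeholder cluster URL
    Input Ports:
      - id: <input-port-id-from-nifi>    # copy from the input port on the NiFi side
        name: to-nifi
```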