Member since: 06-09-2016
Posts: 48
Kudos Received: 10
Solutions: 0
08-10-2016
04:20 PM
I am trying to update the properties of a ReplaceText processor using the REST API, with the syntax given below:

curl -i -X PUT -H 'Content-Type: application/json' -d '{"revision":{"clientId":"7a1f42ec-f805-4869-a47c-27306a38490a"},"processor":{"id":"5a93362a-482a-42d0-9ef6-f965a08202eb","config":{"properties":{"Search Value":"abcd"}}}}' http://localhost:8080/nifi-api/controller/process-groups/root/processors/5a93362a-482a-42d0-9ef6-f965a08202eb

I am able to update properties like "Replacement Value" using this syntax, but I am unable to update the "Search Value" property. When I execute it, NiFi creates a dynamic property named "Search Value" instead of updating the original "Search Value" property provided by the ReplaceText processor. Is there any problem specific to this property? Has anyone faced this problem before?
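For reference, the request body from the command above can be built programmatically. This is a minimal Python sketch, not a definitive fix: the IDs are the ones from the command, and the `version` field is an assumption (NiFi expects the revision to carry the processor's current version, typically fetched first with a GET on the processor).

```python
import json

processor_id = "5a93362a-482a-42d0-9ef6-f965a08202eb"

payload = {
    "revision": {
        # clientId from the session; "version" must match the processor's
        # current revision (assumption: fetch it first with a GET).
        "clientId": "7a1f42ec-f805-4869-a47c-27306a38490a",
        "version": 1,
    },
    "processor": {
        "id": processor_id,
        "config": {
            # Property names must match the processor's descriptors exactly;
            # an unrecognized name is treated as a new dynamic property.
            "properties": {"Search Value": "abcd"},
        },
    },
}

body = json.dumps(payload)
print(body)
```

The resulting JSON string can then be passed to curl with `-d` exactly as in the command above.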
Labels:
- Apache NiFi
08-08-2016
06:13 AM
Thanks for your inputs. It works fine when using "root" as the process group and using the version properly.
08-07-2016
07:19 AM
Great article, very informative. But in my situation, multiple target directories have to be created at runtime. Is it possible to generate them using a single PutFile processor? How do I handle that, or create multiple PutFile processors at runtime?
08-03-2016
04:18 PM
I am starting from the controller and then going to the processor, since there are no process groups, but this URL throws the error mentioned above. Could you please cite an example, because I am following the steps you mentioned? Is there anything wrong with this URL?
08-03-2016
01:51 PM
1 Kudo
I have created a simple data flow (above) using the NiFi GUI, but I need to execute the same flow via the NiFi REST API. I am trying to start the processor "GetFileTask" using the command below:

curl -i -X PUT -H 'Content-Type: application/json' -d '{"revision":{"clientId":"b7aff2f1-fb0e-4abd-90f4-c64d68989a5b"},"processors":{"id":"9dbceb31-9715-40b7-83c4-118cb5bf7a64","running":"true"}}' http://localhost/nifi-api/controller/processors/b5a9293b-b9bd-4258-b9fb-93607843a327

I am a bit confused about how to generate the URL used in the command above, i.e. http://localhost/nifi-api/controller/processors/b5a9293b-b9bd-4258-b9fb-93607843a327. Is it a web service? When I create the job in the GUI, is it already created, or do I need to create it separately? Please provide some inputs on how to frame this URL. When I try to access the URL it shows "Resource not Found".
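As a sketch, the request body for starting a processor can be built as below. These details are assumptions about NiFi's REST conventions, not a confirmed answer: the entity key is the singular "processor", "running" is a boolean, the processor id in the URL matches the one in the body, and the revision carries a "version" field.

```python
import json

processor_id = "9dbceb31-9715-40b7-83c4-118cb5bf7a64"

payload = {
    "revision": {
        "clientId": "b7aff2f1-fb0e-4abd-90f4-c64d68989a5b",
        # Assumption: "version" must match the processor's current revision.
        "version": 1,
    },
    # Assumption: singular "processor" key, boolean "running" flag.
    "processor": {"id": processor_id, "running": True},
}

# Assumption: the URL path id should match the id in the body, addressed
# through the "root" process group.
url = ("http://localhost:8080/nifi-api/controller/process-groups/root"
       "/processors/" + processor_id)

body = json.dumps(payload)
print(url)
print(body)
```

The printed URL and body would then replace the URL and `-d` argument in the curl command above.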
Labels:
- Apache NiFi
08-01-2016
01:46 PM
To load the conditions into a DistributedMapCache, do I need to do any coding? If yes, in which language? Can you provide a link to a working example of DistributedMapCache? Actually, I don't have experience in any of the programming languages you mentioned, so I wanted to know whether the first option you mentioned would require any programming.
08-01-2016
12:50 PM
1 Kudo
I have a requirement where I have an input text file and I have to route the data to different directories, based on some filter on the data values, using NiFi. The challenge is that the conditions will be provided at run time, and I have to read them from a config file. Is there any option/processor in NiFi to achieve this requirement?
Labels:
- Apache Hadoop
- Apache NiFi
07-27-2016
11:45 AM
Could you explain what you mean by a state mechanism? How would I implement that: in the producer, or before the data is read by the producer?
07-27-2016
09:54 AM
How do I read only the incremental data using NiFi? I mean, the creation of files in the folder will be a continuous process, but once I have already moved a file I don't want to move the same file again. Will that be possible in NiFi? Coming back to Kafka: if I write a custom producer, will it be possible to move only the incremental data, i.e. the new files, given that file creation will be continuous? Also, could you mention the class to look at if I want to write a custom producer that reads from a path instead of the standard input stream?
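To make the "incremental only" idea concrete, one possible state mechanism is simply persisting the set of file names already processed and skipping them on the next scan. This is a minimal sketch under that assumption, independent of Kafka or NiFi; the directory and file names in the demo are hypothetical.

```python
import json
import os
import tempfile

def new_files(watch_dir, state_path):
    """Return files in watch_dir not seen before, then record them as seen."""
    try:
        with open(state_path) as f:
            seen = set(json.load(f))
    except FileNotFoundError:
        seen = set()  # first run: nothing processed yet

    current = set(os.listdir(watch_dir))
    fresh = sorted(current - seen)

    # Persist the updated state so the next scan skips these files.
    with open(state_path, "w") as f:
        json.dump(sorted(seen | current), f)
    return fresh

# Demo: a temporary directory stands in for the watched folder; the state
# file lives outside it so it is not picked up by the scan.
watch = tempfile.mkdtemp()
state = os.path.join(tempfile.mkdtemp(), "state.json")

open(os.path.join(watch, "a.txt"), "w").close()
first = new_files(watch, state)   # first scan sees a.txt

open(os.path.join(watch, "b.txt"), "w").close()
second = new_files(watch, state)  # second scan sees only the new b.txt
print(first, second)
```

A custom producer would call something like `new_files` on each poll and publish only the returned files; the same idea applies whether the state lives in a local file, a database, or a cache.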
07-27-2016
09:17 AM
I have a use case where I need to read files from a folder on Unix and write the data into the Hadoop File System. Files will be generated in the folder by a downstream process in real time. Once a file has been generated, the data should be moved into Hadoop. I am using Apache Kafka for the process. I need to know how to implement this use case.

1. How do I read only the newly created files from the folder using the Kafka producer? (Any examples/Java classes to use?)
2. How do I write the consumer that writes the files into the Hadoop File System? (Any examples/Java classes to use?)
3. Is there any other technology, like NiFi or Apache Storm, that I need to use along with Kafka to obtain the results, or can this be implemented entirely using Kafka?