Member since: 06-08-2017
Posts: 1049
Kudos Received: 518
Solutions: 312
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 9981 | 04-15-2020 05:01 PM |
| | 5994 | 10-15-2019 08:12 PM |
| | 2444 | 10-12-2019 08:29 PM |
| | 9666 | 09-21-2019 10:04 AM |
| | 3540 | 09-19-2019 07:11 AM |
10-17-2017
01:51 AM
Hi @xav webmaster, I think this answer will help you.
Sample flow:- GenerateFlowFile --> ExtractText --> UpdateAttribute --> PublishKafka
GenerateFlowFile:- I am using this only for testing; in your case some other processors will feed the flow.
ExtractText processor:- this processor extracts the content of the flowfile into an attribute. Example flowfile content:- adult,dog,bulldog,9,23,male,brown,4,etc. By adding a new property to ExtractText we capture the content and keep it as an attribute of the flowfile:
cnt_attr as (.*) //capture everything and add it to the flowfile as cnt_attr
Configs:-
Output of this processor:- every flowfile will now have a cnt_attr attribute associated with it, which we can use in the next processor.
UpdateAttribute processor:- to dynamically change the topic name based on the cnt_attr attribute, we need the Advanced usage of the UpdateAttribute processor. Right-click the UpdateAttribute processor and click the Advanced button in the lower right corner.
Steps:- open the screenshot above in a new tab to see steps 1-4 and match them with the steps below.
1. As shown in the screenshot, change FlowFile Policy to Use Original.
2. Click the + sign next to Rules and name the rule adult_dog.
3. Click the + sign next to Conditions and add the check condition ${cnt_attr:matches('.*adult.*dog.*')}
4. Click the + sign next to Actions, give the attribute name kafka_topic and the value adult_dog.
New rule:- for cat_dog the condition check is ${cnt_attr:matches('.*cat.*dog.*')}, and in Actions add the attribute name kafka_topic with value cat_dog, the same as steps 2-4 above.
To summarize all the steps:-
step 1: we use the original flowfile,
step 2: we create a rule,
step 3: we add a condition that checks whether the cnt_attr attribute matches, and
step 4: if it matches, we add the kafka_topic attribute with the desired name. In this way we can add as many rules as we want in the same UpdateAttribute processor; as you can see in my screenshot, I have added two rules (adult_dog, cat_dog). The processor checks which rule is satisfied and sets the kafka_topic attribute to the name configured in that rule.
PublishKafka:- use the kafka_topic attribute in the Topic Name property of the processor: ${kafka_topic}
Flow screenshot:-
In this way we can use a single UpdateAttribute processor to dynamically set the value of kafka_topic and use that same attribute to publish messages to the respective topics.
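The rule/condition/action logic described above can be sketched outside NiFi. This is a minimal Python illustration of "first matching rule sets the topic attribute" (the rule names, patterns, and sample content come from the steps above; this is not NiFi code):

```python
import re

# Rules mirror the UpdateAttribute conditions: the matching rule sets kafka_topic
rules = [
    (re.compile(r".*adult.*dog.*"), "adult_dog"),
    (re.compile(r".*cat.*dog.*"), "cat_dog"),
]

def route_topic(cnt_attr):
    """Return the kafka_topic value for a flowfile's cnt_attr, or None if no rule matches."""
    for pattern, topic in rules:
        if pattern.match(cnt_attr):
            return topic
    return None

print(route_topic("adult,dog,bulldog,9,23,male,brown,4"))  # adult_dog
print(route_topic("cat,dog,tabby,2"))                      # cat_dog
```

In NiFi the same check happens inside the processor, and the downstream PublishKafka resolves ${kafka_topic} per flowfile.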
04-24-2019
06:38 AM
@Shu Could you please guide me on how to trigger the REST API command scripts in NiFi that you suggested in this post?
10-15-2017
07:45 PM
That's a great answer. Thank you so much for the help. It'll really come in handy. I can see that he has taken time to respond, and I thank him for that.
03-06-2018
05:50 PM
@Shu How is the number of mappers/reducers for a given query decided at runtime? Does it depend on the number of join, group by, or order by clauses used in the query? If yes, please let me know how many mappers and reducers are launched for the query below. select name, count(*) as cnt from test group by name order by name;
09-20-2018
11:36 PM
@satyadevi jagata Change the Destination property of the EvaluateXQuery processor to
flowfile-attribute and then try rerunning the processor. If Destination is set to flowfile-content, the processor does not allow more than one query to be added. If the issue is still not resolved, please open a new question for more visibility to the community, and include all the details of what you have tried so far along with sample data to reproduce the issue.
10-22-2017
03:51 AM
1 Kudo
Hi @Mohan, sure, we can get the results you expect by using the following flow:
EvaluateXQuery //keep all the required contents as attributes of the flowfile
UpdateAttribute //update the attribute values extracted in the EvaluateXQuery processor
ReplaceText //replace the flowfile content with the flowfile's attributes
PutHDFS //store the files in HDFS
EvaluateXQuery configurations:- change the existing properties
1. Destination to flowfile-attribute
2. Output: Omit XML Declaration to true
Add new properties by clicking the + sign:
1. author //author
2. book //book
3. bookstore //bookstore
Input:- <?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="myfile.xsl" ?>
<bookstore specialty="novel">
<book style="autobiography">
<author>
<first-name>Joe</first-name>
<last-name>Bob</last-name>
<award>Trenton Literary Review Honorable Mention</award>
</author>
<price>12</price>
</book>
</bookstore>
Output:- As you can see in the screenshot, all the contents (book, bookstore, author) are now attributes of the flowfile.
EvaluateXQuery processor configs screenshot:-
UpdateAttribute processor:-
1. author
${author:replaceAll('<author>([\s\S]+.*)<\/author>','$1')} //updates the author attribute
Input to the UpdateAttribute processor:-
<author>
<first-name>Joe</first-name>
<last-name>Bob</last-name>
<award>Trenton Literary Review Honorable Mention</award>
</author>
Output:-
<first-name>Joe</first-name>
<last-name>Bob</last-name>
<award>Trenton Literary Review Honorable Mention</award>
2. book
${book:replaceAll('<book\s(.*)>[\s\S]+<\/author>([\s\S]+)<\/book>','$1$2')}
Input:-
<book style="autobiography">
<author>
<first-name>Joe</first-name>
<last-name>Bob</last-name>
<award>Trenton Literary Review Honorable Mention</award>
</author>
<price>12</price>
</book>
Output:-
style="autobiography"
<price>12</price>
3. bookstore
${bookstore:replaceAll('.*<bookstore\s(.*?)>[\s\S]+.*','$1')}
Input:-
<bookstore specialty="novel">
<book style="autobiography">
<author>
<first-name>Joe</first-name>
<last-name>Bob</last-name>
<award>Trenton Literary Review Honorable Mention</award>
</author>
<price>12</price>
</book>
</bookstore>
Output:-
specialty="novel"
Configs:-
ReplaceText processor:- change the Replacement Strategy property to Always Replace and use your attributes (bookstore, book, author) in this processor; we are going to overwrite the existing content of the flowfile with the new content. Add two more ReplaceText processors for the book and author attributes.
Output:-
<first-name>Joe</first-name>
<last-name>Bob</last-name>
<award>Trenton Literary Review Honorable Mention</award>
PutHDFS processor:- configure the processor and give the directory where you want to store the data.
Flow screenshot:- For testing purposes I have used a GenerateFlowFile processor, but in your case the source will be whichever processor you are getting this XML data from.
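The replaceAll expressions above can be checked outside NiFi before wiring up the flow. A quick Python sketch, with re.sub standing in for NiFi's replaceAll on the same patterns and the same sample XML (this is an illustration, not NiFi code):

```python
import re

author_xml = """<author>
<first-name>Joe</first-name>
<last-name>Bob</last-name>
<award>Trenton Literary Review Honorable Mention</award>
</author>"""

# Mirrors ${author:replaceAll('<author>([\s\S]+.*)<\/author>','$1')}:
# strip the outer <author> tags, keep the body
author_body = re.sub(r"<author>([\s\S]+.*)</author>", r"\1", author_xml)
print(author_body.strip())

bookstore_xml = """<bookstore specialty="novel">
<book style="autobiography">
<author>
<first-name>Joe</first-name>
<last-name>Bob</last-name>
<award>Trenton Literary Review Honorable Mention</award>
</author>
<price>12</price>
</book>
</bookstore>"""

# Mirrors ${bookstore:replaceAll('.*<bookstore\s(.*?)>[\s\S]+.*','$1')}:
# keep only the attribute string of the <bookstore> tag
bookstore_attrs = re.sub(r".*<bookstore\s(.*?)>[\s\S]+.*", r"\1", bookstore_xml)
print(bookstore_attrs)  # specialty="novel"
```

The `[\s\S]` trick in these patterns matches any character including newlines, which is why the expressions work on multi-line XML without a DOTALL-style flag.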
10-12-2017
01:13 PM
@Simon Jespersen You can use UpdateAttribute and create a new attribute with the following EL : ${filename:getDelimitedField(3,'_')}
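NiFi's getDelimitedField takes a 1-based field index and a delimiter. A rough Python stand-in showing what ${filename:getDelimitedField(3,'_')} returns (the sample filename is made up for illustration):

```python
def get_delimited_field(value, index, delimiter="_"):
    """Rough stand-in for NiFi EL getDelimitedField: returns the 1-based Nth field."""
    return value.split(delimiter)[index - 1]

# Hypothetical filename, purely for illustration
print(get_delimited_field("sensor_2017_reading.csv", 3))  # reading.csv
```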
04-22-2019
11:58 AM
Could you please share the NiFi template for this? I am not able to achieve this.
10-12-2017
06:05 PM
@Eric Lloyd, for this case we cannot use wildcards like [*], as this processor does not accept that kind of pattern. Change the Files to Tail property to
test[1-2]/[\d|a-z.*]{1,}/test.log
Expression explanation:-
test[1-2] --look for test1 or test2
[\d|a-z.*]{1,} --check for directory names made up of digits or letters, one or more characters, and list all those directories recursively.
Configs:-
For your case:- the Files to Tail property should be something like
versions/[\d|a-z.*]{1,}/<your-log-file-name>
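The Files to Tail value above is a regular expression applied per path segment. A quick Python check of which paths the pattern accepts (the sample paths are made up for illustration; this only exercises the regex, not the TailFile processor itself):

```python
import re

# Same pattern as the Files to Tail property above
pattern = re.compile(r"test[1-2]/[\d|a-z.*]{1,}/test.log")

paths = [
    "test1/abc/test.log",   # matches: test1, letters-only directory
    "test2/2017/test.log",  # matches: test2, digits-only directory
    "test3/abc/test.log",   # no match: test3 is outside test[1-2]
]
for p in paths:
    print(p, bool(pattern.fullmatch(p)))
```

Note that `[\d|a-z.*]` is a character class, so it matches digits, letters, and the literal characters |, ., and * one or more times; it does not match `/`, which keeps each segment confined to one directory level.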