Member since 06-24-2021 · 7 Posts · 0 Kudos Received · 0 Solutions
11-08-2022
12:44 PM
Hi, I'm only able to do that via two processors:

1- JoltTransformJSON: this is to add the desired attributes (attr_id & attr_name) to the flowfile JSON using the following shift spec:

[
  {
    "operation": "shift",
    "spec": {
      "#${attr_id}": "attr_id",
      "#${attr_name}": "attr_name",
      "model": "model",
      "cars_drivers": {
        "*": {
          "id": "cars_drivers[#2].id",
          "name": "cars_drivers[#2].name",
          "is_processed": "cars_drivers[#2].is_processed"
        }
      }
    }
  }
]

2- UpdateRecord: once the attributes are added to the JSON, you can update the is_processed value only for the records whose id matches attr_id and whose name matches attr_name. To do that, set the following properties:

a- Replacement Value Strategy: Literal Value
b- /cars_drivers[*]/id[.=../../../attr_id]/../name[.=../../../attr_name]/../is_processed : 1

Hope that helps; if it does, please accept the solution.
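For illustration, a hypothetical flowfile (the model and driver values below are made up) with flowfile attributes attr_id = 2 and attr_name = Bob would look roughly like this before the JoltTransformJSON step:

{
  "model": "Model X",
  "cars_drivers": [
    { "id": 1, "name": "Alice", "is_processed": 0 },
    { "id": 2, "name": "Bob", "is_processed": 0 }
  ]
}

and roughly like this after it (the "#" wildcard writes the attribute values as strings):

{
  "attr_id": "2",
  "attr_name": "Bob",
  "model": "Model X",
  "cars_drivers": [
    { "id": 1, "name": "Alice", "is_processed": 0 },
    { "id": 2, "name": "Bob", "is_processed": 0 }
  ]
}

The UpdateRecord path in step b- then sets is_processed to 1 only for the second entry, where both id and name match the added attributes.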
08-10-2022
12:55 PM
2 Kudos
Hi, you can use ReplaceText by setting the Evaluation Mode property to "Line-by-Line" and the "Line-by-Line Evaluation Mode" to "Last Line". Set the Replacement Value to an empty string (Empty String Set) and leave the Search Value as "(?s)(^.*$)", as seen in the screenshot below. If you find this helpful please accept the solution. Thanks
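For what it's worth, here is a minimal Python sketch (not NiFi itself, and the sample content is made up) of what that configuration does: the search pattern is applied only to the last line and the match is replaced with an empty string, which clears the last line's content.

import re

def drop_last_line(content):
    # Mimics ReplaceText with Evaluation Mode = Line-by-Line and
    # Line-by-Line Evaluation Mode = Last Line: only the last line is touched.
    lines = content.splitlines(keepends=True)
    if not lines:
        return content
    # Search Value from the post, Replacement Value = empty string
    lines[-1] = re.sub(r"(?s)^.*$", "", lines[-1])
    return "".join(lines)

print(repr(drop_last_line("header\nrow1\nrow2")))  # 'header\nrow1\n'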
07-31-2022
06:21 PM
@KhASQ, besides @SAMSAL's solution, you can also use ReplaceText to eliminate the need to extract the entire content as an attribute. You'd still have to set a large enough buffer, though, to ensure your largest message can be processed. Cheers, André
08-31-2021
01:58 PM
Hi, Apache Spark will initiate the connection to your DB on that port only, via JDBC, so you can open a firewall rule where the sources are your Spark nodes' IPs and the destination is your DB server's IP on the port you specified. Best Regards
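As a minimal sketch of where that connection comes from (the host db-host, port 5432, database, table, and credentials below are all hypothetical), a JDBC read in PySpark looks like this; every node running an executor for this read connects to db-host:5432, which is exactly the source/destination pair the firewall rule has to allow:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("jdbc-read").getOrCreate()

# The matching JDBC driver jar (here PostgreSQL) must be on the Spark classpath.
df = (spark.read.format("jdbc")
      .option("url", "jdbc:postgresql://db-host:5432/mydb")  # hypothetical host, port and database
      .option("dbtable", "public.my_table")                  # hypothetical table
      .option("user", "etl_user")                            # hypothetical credentials
      .option("password", "etl_password")
      .load())

df.show(5)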
06-28-2021
02:59 AM
Hi @KhASQ, for watermarking, use any framework/DB to update the watermark values once the job completes successfully. If you are using Kafka, you can store the Kafka-related watermark in Kafka itself. If you want to use something other than Kafka, choose any RDBMS or an HBase table.
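A minimal sketch of the RDBMS option (the table and column names are made up, and Python's built-in sqlite3 stands in for whatever RDBMS you pick): read the stored watermark before the job, process only newer data, and update the watermark only after the job has finished successfully.

import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect("watermarks.db")
conn.execute("""CREATE TABLE IF NOT EXISTS job_watermark (
                    job_name TEXT PRIMARY KEY,
                    last_processed_ts TEXT)""")

def get_watermark(job_name):
    # Returns the last successfully processed timestamp, or None on the first run.
    row = conn.execute("SELECT last_processed_ts FROM job_watermark WHERE job_name = ?",
                       (job_name,)).fetchone()
    return row[0] if row else None

def update_watermark(job_name, new_ts):
    # Upsert the new watermark; call this only after the job succeeded.
    conn.execute("""INSERT INTO job_watermark (job_name, last_processed_ts) VALUES (?, ?)
                    ON CONFLICT(job_name) DO UPDATE SET last_processed_ts = excluded.last_processed_ts""",
                 (job_name, new_ts))
    conn.commit()

last_ts = get_watermark("my_spark_job")
# ... run the job here, reading only records with timestamp > last_ts ...
update_watermark("my_spark_job", datetime.now(timezone.utc).isoformat())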