Member since: 06-26-2015
Posts: 509
Kudos Received: 136
Solutions: 114
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 1332 | 09-20-2022 03:33 PM |
 | 3868 | 09-19-2022 04:47 PM |
 | 2274 | 09-11-2022 05:01 PM |
 | 2371 | 09-06-2022 02:23 PM |
 | 3727 | 09-06-2022 04:30 AM |
08-28-2022
03:45 PM
@ramesh0430 , What's the throughput that you are expecting? And what are your processor configurations? Cheers, André
08-28-2022
03:40 PM
@belka , No, Apache Druid is not currently supported. Cheers, André
08-26-2022
07:27 PM
@FozzieN , You can do exactly what you're trying to achieve using standard NiFi processors and a schema to describe your data structure. Please see the attached flow example, which replicates what you are doing without the need for scripting. This flow uses the following schema, set as a parameter:

{
  "type": "record",
  "name": "MyData",
  "fields": [
    { "name": "source_timestamp", "type": "string" },
    { "name": "signal", "type": "string" },
    { "name": "value", "type": "string" },
    { "name": "fecha_envio", "type": ["null", "string"], "default": null }
  ]
}

Cheers, André
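For a quick sanity check of that schema, here is a minimal Python sketch, assuming the fastavro library is available (any Avro-aware tool would work just as well); the sample field values are invented:

```python
# Validate sample records against the schema from the post above.
# Assumes `pip install fastavro`; the sample values are invented.
from fastavro import parse_schema
from fastavro.validation import validate

schema = parse_schema({
    "type": "record",
    "name": "MyData",
    "fields": [
        {"name": "source_timestamp", "type": "string"},
        {"name": "signal", "type": "string"},
        {"name": "value", "type": "string"},
        {"name": "fecha_envio", "type": ["null", "string"], "default": None},
    ],
})

# fecha_envio is a ["null", "string"] union with a null default,
# so it may carry a timestamp string or be null.
records = [
    {"source_timestamp": "2022-08-26 19:00:00", "signal": "temp", "value": "21.5",
     "fecha_envio": "2022-08-26 19:05:00"},
    {"source_timestamp": "2022-08-26 19:00:00", "signal": "temp", "value": "21.5",
     "fecha_envio": None},
]

for record in records:
    validate(record, schema)  # raises ValidationError if a record does not match
print("all records conform to MyData")
```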
08-25-2022
05:07 PM
@Arash , It's likely that your flow is running on multiple nodes and each node is trying to process every file. When different nodes try to write to the same file at the same time, you get these errors. You can change your flow to use a List-Fetch pattern: the ListHDFS processor should run only on the primary node, and its output (the list of files) can be load-balanced across the nodes so that each node fetches a different set of files with FetchHDFS. Cheers, André
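For intuition only, here is a minimal sketch of the List-Fetch idea in plain Python (not NiFi): the listing happens once, and the files are then divided so that no two nodes ever touch the same file. The paths and node names are invented.

```python
# Conceptual sketch of the List-Fetch pattern (plain Python, not NiFi).
# One listing pass (ListHDFS on the primary node), then the file list is
# load-balanced so each node fetches a disjoint subset (FetchHDFS).
from itertools import cycle

def list_files():
    # Stand-in for ListHDFS running on the primary node only.
    return [f"/data/incoming/file_{i:03d}.csv" for i in range(10)]

def fetch_and_process(node, path):
    # Stand-in for FetchHDFS plus downstream processing on a given node.
    print(f"{node} fetches {path}")

nodes = ["node-1", "node-2", "node-3"]

# Round-robin distribution: every file is handled by exactly one node,
# which removes the concurrent writes that caused the original errors.
for node, path in zip(cycle(nodes), list_files()):
    fetch_and_process(node, path)
```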
08-21-2022
10:06 PM
@chitrarthasur, Has the reply helped resolve your issue? If so, please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future.
08-21-2022
12:52 PM
@SandeepSingh The Hue user has the DBA privilege, however I still get the same issue. Can anyone help? Which privilege exactly do you mean?
08-16-2022
11:02 PM
@ho_ddeok, Has any of the replies helped resolve your issue? If so, can you please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future?
08-16-2022
04:47 PM
Thanks @araujo. I am using server-side calls to get the information and then pass it on to the client side. With this approach, all API calls are made from the server-side component.
08-16-2022
06:34 AM
Hi @noekmc , Are you referring to the highlighted "Restart service" option shown below, under ldap_url? If yes, it is expected, and you can refer to my earlier comment.
08-14-2022
03:58 PM
1 Kudo
@totokogure , The EvaluateJsonPath processor that you added extracts the $.hits array and stores it as an attribute. The next processor (SplitJson) does not even use that attribute, though; it extracts $.hits again from the flowfile content. The EvaluateJsonPath in this flow is therefore unnecessary. If you connect InvokeHTTP directly to SplitJson, things should work correctly. Cheers, André
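To illustrate the point, here is a minimal Python sketch of what SplitJson effectively does with the $.hits path; it operates on the flowfile content directly, so nothing needs to be copied into an attribute first. The sample response body is invented.

```python
# Sketch of SplitJson behaviour for a JsonPath of $.hits (plain Python, not NiFi).
# It reads the flowfile *content*, takes the array at that path, and emits
# one flowfile per element, so the upstream EvaluateJsonPath adds nothing.
import json

# Invented InvokeHTTP response body (the flowfile content).
response_body = json.dumps({
    "took": 3,
    "hits": [
        {"id": 1, "name": "first"},
        {"id": 2, "name": "second"},
    ],
})

hits = json.loads(response_body)["hits"]
split_flowfiles = [json.dumps(hit) for hit in hits]

for flowfile_content in split_flowfiles:
    print(flowfile_content)  # each array element becomes its own flowfile
```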