Member since: 07-29-2020
Posts: 574
Kudos Received: 323
Solutions: 176

My Accepted Solutions
| Title | Views | Posted |
| --- | --- | --- |
| | 1976 | 12-20-2024 05:49 AM |
| | 2202 | 12-19-2024 08:33 PM |
| | 2022 | 12-19-2024 06:48 AM |
| | 1324 | 12-17-2024 12:56 PM |
| | 1899 | 12-16-2024 04:38 AM |
05-08-2023
05:29 AM
I have used the spec below to move some of the resource characteristics to the top level, but I am getting null values.

[
  {
    "operation": "shift",
    "spec": {
      "header": {
        "timeStamp": "records.inv_activity_ts",
        "activityId": "records.inv_activity_id",
        "action": "records.action"
      },
      "resource": {
        "drniId": "records.inv_id",
        "subtype": "records.inv_subtype",
        "name": "records.inv_name",
        "resourceCharacteristic": {
          "*": {
            "name": {
              "matchingkey": "records.matchingkey-value",
              "status": "records.status-value"
            },
            "value": {
              "matchingkey": "records.matchingkey-value",
              "status": "records.status-value"
            }
          }
        }
      }
    }
  }
]

The output I am getting is:

{
  "records": {
    "inv_activity_ts": "1670484663189",
    "inv_activity_id": "256388257993155783",
    "action": "create",
    "inv_id": "256383859946641699",
    "inv_subtype": "Backplane Connection",
    "inv_name": "Backplane Connection",
    "matchingkey-value": null,
    "status-value": null
  }
}

What mistake am I making?
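For reference, a minimal sketch of the value-matching idiom that is often used for this kind of name/value characteristic lookup. This is not taken from the post, it only covers the resourceCharacteristic part, and the walk-up count in the "@" references is an assumption that may need adjusting against the real input:

[
  {
    "operation": "shift",
    "spec": {
      "resource": {
        "resourceCharacteristic": {
          "*": {
            "name": {
              "matchingkey": {
                "@(3,value)": "records.matchingkey-value"
              },
              "status": {
                "@(3,value)": "records.status-value"
              }
            }
          }
        }
      }
    }
  }
]

The idea is to descend into the value of each entry's "name" field and, when it equals "matchingkey" or "status", copy the sibling "value" field to the desired output key, rather than treating "matchingkey" and "status" as literal input keys.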
05-07-2023
05:05 PM
1 Kudo
@ThienTrinh, as a workaround you can follow this scenario: https://stackoverflow.com/questions/55118389/i-have-two-json-payload-i-want-to-merge-them-in-a-single-json-object/55124212#55124212
05-05-2023
01:39 PM
2 Kudos
Hi, One of the main differences that I can see is that QueryDatabaseTableRecord has a Record Writer, which allows you to decide the format of the output (JSON, XML, CSV, Parquet, etc.), where a service needs to be set up for the record writer depending on the format, while QueryDatabaseTable will only produce Avro output without the need to set up any record writer service. This is similar to the case of the ExecuteSQL vs. ExecuteSQLRecord processors. Another important difference I see is that QueryDatabaseTable has a property setting for "Transaction Isolation Level" while the other doesn't. If that helps, please accept the solution. Thanks
05-04-2023
10:55 AM
Hi @SAMSAL, thanks for your response. You are right, changing to "Entire Text" worked. I was in a hurry and didn't try changing the evaluation mode; I assumed that "Always Replace" would do the work. Thank you
05-04-2023
12:33 AM
1 Kudo
@danielhg1285, While the solution provided by @SAMSAL seems to be better for you and more production ready, you could also try the steps below. This might work if you are using a stable statement all the time and if you are not restricted to seeing the exact INSERT statement, but rather the values that were being inserted.

- Shortly after RetryFlowFile, you can add an AttributesToJSON processor and manually define all the columns which you want to insert in the Attributes List property. Make sure that you use the attribute names from your FlowFile (sql.args.N.value) in the correct order and that you set Destination = flowfile-content. In this way, you will generate a JSON file with all the columns and all the values which you tried to insert but failed.
- After AttributesToJSON, you can keep your PutFile to save the file locally on your machine, so you can open it whenever and wherever you want 🙂

PS: This is maybe not the best solution, due to the following reasons, but it will get you started on your track:
- You will need to know how many columns you have to insert, and each time a new column is added you will have to modify your AttributesToJSON processor.
- You will not get the exact SQL INSERT/UPDATE statement, but a JSON file containing the column-value pairs, which can easily be analyzed by anybody.
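For illustration only (the attribute numbers and values below are hypothetical), the FlowFile content produced by AttributesToJSON would then look roughly like this, with one key per attribute listed in the Attributes List property:

{
  "sql.args.1.value": "1001",
  "sql.args.2.value": "2023-05-03",
  "sql.args.3.value": "PENDING"
}

Each key is the FlowFile attribute name and each value is the parameter that the failed statement was trying to insert.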
04-24-2023
09:37 AM
@wolfsilver00 Has the reply helped resolve your issue? If so, please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future.
04-22-2023
10:57 AM
Actually, I want to extract all three, and there can be more, like n options, so I need to extract all of these n options; I explained it a bit wrong, sorry for that. All of them will have some values and I need to extract them one by one. I can't edit the question now; it's not allowing me. {
"status": "pass",
"values": {
"WorkFlow": 0,
"Reasons": {
"resID": "",
"options": {
"9876": [
"t1",
"t2"
],
"9875":[ "t1",
"c2"],
"9874":[ "x1",
"a2"]
}
}
}
} There can be N options, not only these 3 (9876, 9875, 9874), and I have to extract all of those options because I have to run an API call on each of these values one by one; that is why I have to extract them one by one and store them in some variable. The flow is like this: the first option (9876) will be extracted into some variable -> we will check whether it is empty or not -> if not empty -> it will call an API using this option (9876) -> the same will happen for all the options one by one. But I am unable to fetch these options one by one.
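One commonly used way to make the option ids iterable one at a time is to first reshape the options map into an array with a Jolt shift, using the standard key/value idiom. This is only a sketch, not taken from the thread; the output field names (id, values) and the follow-up steps (for example SplitJson on $.options before calling the API) are assumptions:

[
  {
    "operation": "shift",
    "spec": {
      "values": {
        "Reasons": {
          "options": {
            "*": {
              "$": "options[#2].id",
              "@": "options[#2].values"
            }
          }
        }
      }
    }
  }
]

This would turn the options map into an array of { "id": "9876", "values": ["t1", "t2"] } objects, which can then be split into one FlowFile per option and fed to the API call one by one.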
04-19-2023
03:16 AM
Thank you, sir @SAMSAL. I am very grateful; at the moment, your solution does the job. Also, I would love to learn Jolt transformation from you. Thanks once again.
04-17-2023
02:29 PM
Thank you @SAMSAL. I can always count on you to respond with good ideas. It didn't occur to me to include just a single parameter in the math function, because the documentation shows the "scalb" and "pow" methods being used with a second parameter as the value to pass to the input method. Also, I was trying to use a Double because the documentation says "toRadians" only accepts a double, and the NiFi Expression Language docs show using "toDouble()", but "toDouble" doesn't seem to be supported. Seems to be an inconsistency in the docs. Your suggestions were pretty close. Below is the final result which did the trick.

${latitude:toDecimal():math("toRadians")}

Thank you again!
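For comparison, here is the same pattern next to a two-argument Math method, following the documented behaviour described above (the attribute names base and exponent are placeholders, not from the original flow): single-argument methods such as toRadians take only the method name, while methods such as pow take the second operand as an extra argument to math().

${latitude:toDecimal():math("toRadians")}
${base:toDecimal():math("pow", ${exponent:toDecimal()})}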