Member since: 07-29-2020 · Posts: 574 · Kudos Received: 320 · Solutions: 175
09-03-2024
12:36 AM
3 Kudos
I think you are confusing the ExtractText and ReplaceText processors. ExtractText doesn't have Search Value & Replacement Value properties, but ReplaceText does. That is why I said posting a screenshot would be helpful: had I known it was ReplaceText, my answer would have been different. To get the desired result in this case, you need to specify the following pattern in the Search Value property: ^(.{5})(.{10}).* Basically, you need to match the full line of text that you want to replace with the matched groups. When you stopped at "^(.{5})(.{10})" it meant that you only wanted to replace up to the 15th character of the full text with the result $1,$2, which is why you were getting the remainder of the text appended. By adding ".*" at the end, the whole line is replaced, not just the first 15 characters. The final config will look like this. I hope that makes sense.
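The difference between the two patterns can be sketched outside NiFi with Python's `re.sub`, which behaves much like ReplaceText's regex replacement (the sample line here is hypothetical):

```python
import re

line = "smithaddress123AAAA"

# Without the trailing .*, only the first 15 characters are matched and
# replaced, so the rest of the line ("AAAA") is left dangling afterwards.
partial = re.sub(r"^(.{5})(.{10})", r"\1,\2", line)
# -> "smith,address123AAAA"

# With .* the match consumes the whole line, so only $1,$2 remains.
full = re.sub(r"^(.{5})(.{10}).*", r"\1,\2", line)
# -> "smith,address123"
```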
09-03-2024
12:09 AM
1 Kudo
Hi,
My apologies, I forgot to mention that in both cases you need to set the Timestamp Format in the CSVRecordWriter to the target format, since by default it converts the datetime to epoch time:
The point of the conversion in the QueryRecord is to tell the CSVReader that this is a datetime; however, without setting the format in the writer, it was converting the value back to epoch time, as the documentation states:
Setting the format there is critical to get the desired output.
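A rough Python sketch of what the writer does with and without a format (the Java pattern yyyy-MM-dd HH:mm:ss corresponds to strftime's %Y-%m-%d %H:%M:%S; the sample value is hypothetical):

```python
from datetime import datetime, timezone

ts = datetime(2024, 9, 7, 6, 47, 19, tzinfo=timezone.utc)

# Without a Timestamp Format, the record writer falls back to the raw
# representation: milliseconds since the epoch.
epoch_millis = int(ts.timestamp() * 1000)
# -> 1725691639000

# With a format like "yyyy-MM-dd HH:mm:ss" (Java pattern), the value is
# rendered as a readable string instead.
formatted = ts.strftime("%Y-%m-%d %H:%M:%S")
# -> "2024-09-07 06:47:19"
```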
Hope that helps.
09-02-2024
04:53 PM
2 Kudos
Hi, It would have been helpful if you could provide some examples of the different scenarios, showing what is expected vs. what you are getting. Also, providing a screenshot of the processor(s) in question helps confirm that you have the correct configuration for your case. One thing that is confusing to me is that you don't mention anything about whitespace and whether it counts as a character in the name or the address. Going with what you provided, assume we have the following line: smithaddress123AAAA where the name is expected to be: smith (characters 1-5) and the address: address123 (characters 6-15). I have configured the ExtractAddress processor as follows (basically adding new dynamic properties to define the extracted attributes): The output flowfile will have the following attributes, which is what is expected: The reason you are getting additional attributes with an index suffix is because of how the processor breaks up matching groups. You can read more about this here. If you find this helpful please accept the solution. Thanks
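The fixed-width split above can be sketched with Python regexes, mirroring what the dynamic properties do (the property names name/address come from the example above; the indexed attributes ExtractText also emits, e.g. name.0/name.1, correspond to the whole match and the capture group):

```python
import re

line = "smithaddress123AAAA"

# Each dynamic property in ExtractText is a regex whose capture group
# becomes a flowfile attribute.
patterns = {
    "name": r"^(.{5})",          # characters 1-5
    "address": r"^.{5}(.{10})",  # characters 6-15
}

attributes = {}
for attr, pattern in patterns.items():
    m = re.match(pattern, line)
    if m:
        attributes[attr] = m.group(1)

# -> {'name': 'smith', 'address': 'address123'}
```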
08-31-2024
10:19 AM
1 Kudo
Hi @NagendraKumar , I'm not sure you can use the function "DATE_FROM_UNIX_DATE", since according to the Apache Calcite documentation it's not a standard function. If I may recommend two approaches to solve this problem:

1- Using the SQL Calcite function TIMESTAMPADD:
select TIMESTAMPADD(SECOND, 1724851471, cast('1970-01-01 00:00:00' as timestamp)) mytimestamp from flowfile

2- Using Expression Language:
select '${literal('1724851471'):multiply(1000):format('yyyy-MM-dd HH:mm:ss')}' mytimestamp from flowfile

In both cases you have to be aware of the timezone the timestamp is converted into; I believe one uses local time while the other uses GMT. Hope that helps. If it helps please accept the solution. Thanks
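As a rough cross-check of what both queries compute, here is the same conversion in Python (UTC shown; the Expression Language format call, by contrast, typically renders in the JVM's local timezone):

```python
from datetime import datetime, timedelta, timezone

epoch_seconds = 1724851471

# Equivalent of TIMESTAMPADD(SECOND, 1724851471,
#                            timestamp '1970-01-01 00:00:00')
ts = datetime(1970, 1, 1, tzinfo=timezone.utc) + timedelta(seconds=epoch_seconds)
result = ts.strftime("%Y-%m-%d %H:%M:%S")
# -> "2024-08-28 13:24:31" (UTC)
```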
08-30-2024
06:23 AM
1 Kudo
Hi @xtd , I did not try this, but I think what is happening is that when you call PutDatabaseRecord with a CSVReader, it doesn't differentiate between a null and an empty string value; it assumes the latter is also null, and that is why you get the error. Have you tried converting the CSV into JSON and then doing an UpdateRecord to see if that makes any difference? Another option is to use PutSQL so that you can write your own SQL, but this would be cumbersome if you have many fields.
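The underlying ambiguity is visible in plain CSV vs. JSON: CSV has no way to distinguish an empty string from a missing value, while JSON does, which is why the CSV-to-JSON detour can help. A minimal sketch:

```python
import csv
import io
import json

# In CSV, an empty field is just '' - there is no separate "null"
# notation, so a reader has to guess whether '' means empty or missing.
row = next(csv.reader(io.StringIO("a,,c")))
# -> ['a', '', 'c']

# JSON distinguishes the two explicitly.
explicit = json.loads('{"x": "", "y": null}')
# -> {'x': '', 'y': None}
```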
08-30-2024
05:54 AM
2 Kudos
Hi @moshell281 , @steven-matison is right: when you use infer schema, the processor maps each value to what it thinks the data type should be, so if it sees an integer it will map it to an integer, and so on. To force certain data types you have to define your own Avro schema and feed it to the JSON RecordSetWriter, since this is your target format. If you are not comfortable working with Avro schemas, you can use the UpdateRecord or JoltTransformJSON processors to cast the value to the proper type. Hope you find this helpful.
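For illustration, a minimal Avro schema that forces a field to string might look like this (the record and field names here are hypothetical placeholders, not from the original post):

```json
{
  "type": "record",
  "name": "example",
  "fields": [
    { "name": "id", "type": "string" },
    { "name": "amount", "type": ["null", "string"], "default": null }
  ]
}
```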
08-27-2024
05:28 AM
Hi, It's not clear from your description what the input JSON format is and what the expected output format is, but from what I was able to understand you have the following JSON:

{
  "sessionStopTime": "09/07/2024 06:47:19",
  "resultCode": [
    "2001",
    "2001"
  ],
  "recordOpeningTime": "09/07/2024 06:44:22"
}

And you want to store the second array element of resultCode in a new field, like this (even though in your case both array elements are the same):

{
  "sessionStopTime": "09/07/2024 06:47:19",
  "resultCode": [
    "2001",
    "2001"
  ],
  "resultcodemscc": "2001",
  "recordOpeningTime": "09/07/2024 06:44:22"
}

If this is the case, then the Jolt spec would look like this:

[
  {
    "operation": "modify-overwrite-beta",
    "spec": {
      "resultcodemscc": "=elementAt(@(1,resultCode),1)"
    }
  }
]

If that helps please accept the solution. Thanks
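Jolt's modify-overwrite-beta with elementAt is essentially indexing into the array; a plain-Python equivalent of the transform (useful for checking expectations, not a NiFi artifact) would be:

```python
import json

doc = json.loads("""
{
  "sessionStopTime": "09/07/2024 06:47:19",
  "resultCode": ["2001", "2001"],
  "recordOpeningTime": "09/07/2024 06:44:22"
}
""")

# elementAt(@(1,resultCode),1) picks index 1, i.e. the second element.
doc["resultcodemscc"] = doc["resultCode"][1]
# doc["resultcodemscc"] -> "2001"
```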
08-26-2024
04:39 PM
2 Kudos
Hi @NagendraKumar , I think you misunderstood how the stateful UpdateAttribute works, which is OK; a lot of people think this way if they have not used it before, and it happened to me as well :). Basically, when you make an UpdateAttribute stateful by setting the Store State property, as the value says it will "Store state locally", meaning you can only access the previous state of a given attribute within the processor itself. As I can see, you are trying to store the Batch Id in the second UpdateAttribute and then access it from the first UpdateAttribute, and that is why you are getting an empty string: it doesn't exist there. You don't need two UpdateAttributes to manage this; one should do the job.

Let's assume we have a flow consisting of the following processors:

1- GenerateFlowFile: This simulates setting a new BatchId attribute by adding a dynamic property BatchID set to some value.
2- UpdateAttribute: This will be stateful and will have two attributes: one to get the last saved BatchId value, and another to set the last saved batch ID to the current one.
3- RouteOnAttribute: This is where you compare previous to current and route accordingly.

Here is what the flow looks like. Here is the config for each processor: GenerateFlowFile: UpdateAttribute: RouteOnAttribute:

Basically, if you run it for the first time, the flowfile will be routed to the unmatched relationship of the RouteOnAttribute (since no previous value was set). However, if you run it again without changing anything, the result will be routed to the matched relationship, since the previously saved value will equal the new one. Change the value in the GenerateFlowFile and it will go to unmatched again, and so on.

You are probably wondering how this works, since in the UpdateAttribute I'm referencing the LastSavedStateBatchID while at the same time it's being set to the CurrentBatchID, so which comes first? The answer is simple. If you refer to the documentation on stateful usage, you will find the following line: "If stateful properties reference other stateful properties then the value for the other stateful properties will be an iteration behind". This means PreviousBatchID will be set to the LastSavedStateBatchID before the latter is reset to the current value, if that makes sense 🙂

There is a cleaner way of doing it which helps eliminate this confusion and the circular reference: defining Rules under the Advanced feature, which you can play with, but I feel this way is much shorter. Hope that helps; if it does please accept the solution. Thanks
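The "an iteration behind" behavior can be sketched as follows (a toy model, not NiFi code; the attribute names mirror the ones used above):

```python
# Toy model of a stateful UpdateAttribute: stateful properties that
# reference other stateful properties see the value from the PREVIOUS
# iteration, so reading and resetting in the same pass works.
state = {"LastSavedStateBatchID": ""}

def update_attributes(current_batch_id):
    # PreviousBatchID reads the state as it was BEFORE this iteration.
    previous = state["LastSavedStateBatchID"]
    # LastSavedStateBatchID is then reset to the current value.
    state["LastSavedStateBatchID"] = current_batch_id
    return {"PreviousBatchID": previous, "BatchID": current_batch_id}

first = update_attributes("batch-1")   # PreviousBatchID == ""        -> unmatched
second = update_attributes("batch-1")  # PreviousBatchID == "batch-1" -> matched
```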
08-26-2024
01:35 AM
1 Kudo
Hi, I was going to recommend UpdateAttribute, since it has the ability to store state. Can you please share screenshots of how it was configured and where it returned blank?
08-23-2024
09:39 PM
2 Kudos
Hi, Can you provide a screenshot of how each processor is configured? Also, for the PutDatabaseRecord, what format are you using for the reader service, and are you providing a defined schema or using the default infer schema? What I suspect is happening is that if you are using infer schema in the record reader, it's converting the value to an integer, and therefore the leading zero is omitted. In this case you need to provide an Avro schema where the field holding this value is assigned a string data type. If that helps please accept the solution. Thanks
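The leading-zero loss is just ordinary integer parsing; a quick sketch of what schema inference effectively does (the sample value is hypothetical):

```python
raw = "00725"

# Schema inference sees digits and treats the field as an integer,
# which cannot carry leading zeros.
as_int = int(raw)
# -> 725

# Declaring the field as a string in the Avro schema keeps it verbatim.
as_str = str(raw)
# -> "00725"
```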