Member since: 07-29-2020
Posts: 574
Kudos Received: 321
Solutions: 175
06-16-2023
07:19 AM
Hi @Ray82, It's hard to predict how this is going to work without understanding your full data flow, so I had to come up with a simple flow to test the query, which worked for me as expected. Please see my flow below and each processor's configuration to get the expected result.

Data Flow:
1) Generate Input Json: a GenerateFlowFile processor to produce the input json you provided and to set the query attributes.
2) QueryRecord: runs the query provided above to generate the expected result.

Notice that you need to connect the query relationship "QueryRel", and not the original relationship, to the downstream processor to get the expected output. After running the data flow above, the output shows up in the QueryRel relationship queue. Also notice that to get the output as an array, the JsonRecordSetWriter used as the QueryRecord Record Writer has to be configured to write the record set as an array (in the standard JsonRecordSetWriter this is controlled by the Output Grouping property).

Like I said in the beginning, everything depends at the end on your overall data flow, how you are getting the data and attributes, and in what format. Hopefully the details above can guide you in case you are missing something, or help you figure out what is making this not work in your case. Also be aware that I'm using version 1.20.0, so depending on which version you are using the behavior can be different. Hope that helps.
06-16-2023
06:09 AM
Hi, The path you specified seems to be correct according to the XML you provided. Since you did not provide the full XML, I was only able to test it on the following:

<?xml version="1.0" encoding="UTF-8"?>
<SalesOrder>
<MessageProcessingInformation>
<SendingSystemID>SMS_0175</SendingSystemID>
</MessageProcessingInformation>
<Header>
<SalesDocumentType>YPO0</SalesDocumentType>
<SalesOrganization>7002</SalesOrganization>
<DistributionChannel>00</DistributionChannel>
<RequestedDeliveryDate>2023-06-15</RequestedDeliveryDate>
<CustomerPurchaseOrderDate>2023-06-15</CustomerPurchaseOrderDate>
<Incoterms1>EXW</Incoterms1>
<Incoterms2>Ex Works</Incoterms2>
<PaymentTerms>E007</PaymentTerms>
<CustomerPurchaseOrderNumber>990831 </CustomerPurchaseOrderNumber>
<ReferenceDocumentNumber>SI-000000614</ReferenceDocumentNumber>
<ServicesRenderedDate>2023-06-15</ServicesRenderedDate>
<LogisticData>
<SourceReferenceDocumentNumber>SI-000000614</SourceReferenceDocumentNumber>
<SourceSalesOrderNumber>SO-000050562</SourceSalesOrderNumber>
</LogisticData>
</Header>
</SalesOrder>

and I was able to get the ReferenceDocumentNumber using the path /SalesOrder/Header/ReferenceDocumentNumber. Notice I had to remove the following line because I was getting a parsing error:

<ns1:SalesOrderMessages mlns:ns1="http://corpintra.net/pi/CBFC_GLOBAL_SAP_APPL/SalesOrder" xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">

What are you seeing exactly? Are you seeing an error, or is the value just not populated in the FileName attribute? Can you please provide more info on that? Thanks
06-15-2023
03:09 PM
Update: Actually, I just found out that you can achieve what you need using the QueryRecord processor by defining the query relationship property with the following query:

select reference, barcode, "date",
case when barcode = '${Barcode_Attribute}' then '${Shipment_Number_Attribute}'
else shipment_number end as shipment_number
from flowfile

The Json Jolt transformation above still works, but it's more complicated and probably less efficient. Hope that helps
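To make the query above concrete: assuming the flowfile content is a set of records shaped roughly like the following (the values here are made up for illustration), the record whose barcode matches the Barcode_Attribute value gets its shipment_number replaced with the Shipment_Number_Attribute value, and every other record passes through unchanged:

[
  {
    "reference": "REF-001",
    "barcode": "C337287V28490011",
    "date": "2023-06-14",
    "shipment_number": "SHP-100"
  },
  {
    "reference": "REF-002",
    "barcode": "C337287V28490012",
    "date": "2023-06-14",
    "shipment_number": "SHP-200"
  }
]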
06-15-2023
02:27 PM
Hi, I don't think what you are trying to do can be achieved with UpdateRecord. The reason is that in UpdateRecord you have to define a record path where the specific record/attribute has to be static and cannot take a variable like a flowfile attribute. For example, let's say you want to update the record with barcode "C337287V28490011"; then the record path property would be defined as the following, which would work:

/*[/barcode='C337287V28490011']/shipment_number

However, you cannot define the record path like this:

/*[/barcode='${Barcode_Attribute}']/shipment_number

The only way I can think of to accomplish this in one processor is to use the JoltTransformJSON processor. Jolt is a transformation language for json and it's very powerful. In this processor you set the "Jolt Specification" property with the following spec:

[
{
"operation": "shift",
"spec": {
"*": {
"barcode": {
"${Barcode_Attribute}": {
"$": "[&3].barcode",
"@(2,date)": "[&3].date",
"#${Shipment_Number_Attribute}": "[&3].shipment_number",
"@(2,reference)": "[&3].reference"
},
"*": {
"$": "[&3].barcode",
"@(2,date)": "[&3].date",
"@(2,shipment_number)": "[&3].shipment_number",
"@(2,reference)": "[&3].reference"
}
}
}
}
}
]

Notice how you can use Expression Language in the Jolt spec, where you can specify flowfile attributes as well. For more information on Jolt specs you can refer to the following:
https://jolt-demo.appspot.com/#inception
https://intercom.help/godigibee/en/articles/4044359-transformer-getting-to-know-jolt
If that helps please accept solution. Thanks
06-13-2023
06:47 AM
1 Kudo
Hi, I think what you need is the GetFile processor and not FetchFile. In FetchFile you have to specify the exact file path and name that you want to fetch, and you can't use a wildcard. However, in GetFile you specify the path in the "Input Directory" property and then you can use a regex in the "File Filter" property to capture files with a certain pattern. In your case the search pattern in regex will be something like ".*xyz.*\.txt". You need to watch for other properties like "Keep Source File", which decides what to do with the file after it's picked up; if you want to loop indefinitely then "False" would be the right value. Once the file is picked up you can save it somewhere else using the PutFile processor. The "Recurse Subdirectories" property lets you decide whether you want to search the root folder only or all folder levels. If that helps please accept solution. Thanks
06-06-2023
06:24 AM
2 Kudos
Hi, NiFi's QueryRecord processor uses Apache Calcite SQL. Per the documentation (https://calcite.apache.org/docs/reference.html) you can use either CHAR_LENGTH or CHARACTER_LENGTH to get the length of a string. Both functions seem to have the same behavior according to the documentation. For example, a query like SELECT * FROM FLOWFILE WHERE CHAR_LENGTH(name) > 10 (using a hypothetical "name" column) would return only the records whose name is longer than 10 characters. If you find this helpful please accept solution. Thanks
05-26-2023
07:38 AM
1 Kudo
Hi, The request body has to be provided as the flowfile content to the InvokeHTTP processor, per the processor description. You can use a processor upstream of InvokeHTTP, such as a ReplaceText processor, to generate the request body. To send the incoming flowfile content as the request body, make sure the InvokeHTTP property that controls this (Send Message Body) is set to true, which is the default. To make sure you always get a response whether the call is successful or not, set the Always Output Response property to true (the default is false). If that helps please accept solution. Thanks
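Just as an illustration (the field names and values below are made up, not your actual payload): a ReplaceText processor in front of InvokeHTTP with its Replacement Strategy set to Always Replace could set the flowfile content, and therefore the request body, to something like:

{
  "orderId": "12345",
  "status": "NEW"
}

The point is only that whatever ends up as the flowfile content is what InvokeHTTP sends as the request body.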
05-26-2023
07:17 AM
Hi, I think you can achieve this in two shift transformations, as follows:

[
{
// 1st transformation is basically to isolate
// "OC" value reference into Orders.ValueReference
"operation": "shift",
"spec": {
"Orders": {
"*": {
"Headers": "Orders[#2].&",
"Goods": "Orders[#2].&",
"References": {
"*": {
"TypeReference": {
"OC": {
"@(2,ValueReference)": "Orders[#2].ValueReference"
}
}
}
}
}
}
}
},
// 2nd transformation is the same as you had, except for
//fetching the isolated ValueReference above
//into its own Array based on the GoodsDetails array
{
"operation": "shift",
"spec": {
"Orders": {
"*": {
"Headers": "header",
"Goods": {
"*": {
"GoodsDetails": {
"*": {
"@(2,GoodsTypeName)": "rows.GoodsTypeName",
"Packs": "rows.Packs",
"@(4,ValueReference)": "rows.ValueReference"
}
}
}
}
}
}
}
}
]

If that helps please accept solution. Thanks
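For reference, the two transformations above assume an input shaped roughly like the following (the field values are made up; only the structure matters):

{
  "Orders": [
    {
      "Headers": {
        "OrderNumber": "SO-0001"
      },
      "Goods": [
        {
          "GoodsTypeName": "Pallet",
          "GoodsDetails": [
            {
              "Packs": 4
            }
          ]
        }
      ],
      "References": [
        {
          "TypeReference": "OC",
          "ValueReference": "OC-12345"
        }
      ]
    }
  ]
}

The first shift isolates the ValueReference of the reference whose TypeReference is "OC" onto each order, and the second shift builds the header and rows output from it, as described in the comments above.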
05-25-2023
01:48 PM
1 Kudo
Hi, What is the format of the original flowfile content after PutDatabaseRecord/PutSQL? If it's something like XML or JSON, you can use the QueryRecord processor or a Jolt transformation to enrich the content from the attributes.
05-19-2023
11:14 AM
It seems like ExecuteSQL is being executed continuously because it's scheduled to run that way based on the Run Schedule. You have to set the Run Schedule to the needed frequency by selecting the CRON Driven Scheduling Strategy and setting the CRON expression accordingly in the Run Schedule property. For example, a Run Schedule of 0 0 8 * * ? would run the query once a day at 8 AM.