Member since
06-19-2017
62
Posts
1
Kudos Received
7
Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 4872 | 03-17-2022 10:37 AM |
| | 3076 | 12-10-2021 04:25 AM |
| | 3347 | 08-18-2021 02:20 PM |
| | 8539 | 07-16-2021 08:41 AM |
| | 1853 | 07-13-2021 07:03 AM |
09-05-2021
05:30 AM
Hi, We orchestrate NiFi flow status (completed/failed) through Control-M. These are the steps to follow:
1. Control-M puts the files into the source folder (SFTP server) and waits for the status through a status API (you have to build a separate REST API flow, using a HandleHttpRequest processor, to expose the status message).
2. NiFi's ListSFTP keeps listening for the file (.txt). Once Control-M places a new file, NiFi processes it and the database processor loads the content into the database. The database processor has Success and Failure relationships. For success flowfiles, capture the status as success with the value 1 using an UpdateAttribute processor, and store this value in a distributed cache or other storage area using the relevant processor. Do the same for the failure flow with the value -1.
3. Now that the status message is stored in the distributed cache/other storage area, you can query the status using a fetch processor and pass it to a HandleHttpResponse processor for the waiting Control-M job (Control-M should keep polling the status until it receives 1 or -1).
4. When Control-M finds the value 1, the flow succeeded; if -1, processing failed. Thanks
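The Control-M polling in step 3 can be sketched in plain Python. This is only an illustration of the loop's logic; the `fetch_status` callable stands in for an HTTP GET against the status endpoint exposed by the NiFi flow (the function and parameter names here are hypothetical, not part of Control-M or NiFi):

```python
import time

def poll_status(fetch_status, interval_s=1.0, max_attempts=30):
    """Poll a status source until it returns 1 (success) or -1 (failure).

    fetch_status is any callable returning the current status value, e.g.
    a wrapper around an HTTP GET to the flow's status API.
    Returns the terminal status, or raises TimeoutError if none arrives.
    """
    for _ in range(max_attempts):
        status = fetch_status()
        if status in (1, -1):   # terminal values written by the NiFi flow
            return status
        time.sleep(interval_s)
    raise TimeoutError("flow status never became terminal")

# Simulated status source: still processing twice, then success.
responses = iter([0, 0, 1])
result = poll_status(lambda: next(responses), interval_s=0.0)
print(result)  # 1 -> Control-M treats the flow as successful
```

The same structure applies whatever the transport is; only the `fetch_status` body changes.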
08-18-2021
02:20 PM
Hello, If your input JSON comes in the format below:
{ "TOT_NET_AMT" : "55.00", "H_OBJECT" : "File", "H_GROSS_AMNT" : ["55.00","58.00"], "TOT_TAX_AMT" : "9.55" }
that is, if the value of H_GROSS_AMNT is a list of strings ["55.00","58.00"] rather than the single string "55.00,58.00", then you can use the JoltTransformJSON NiFi processor with a Jolt spec to produce the required output:
{ "TOT_NET_AMT" : "55.00", "H_OBJECT" : "File", "H_GROSS_AMNT" : "58.00", "TOT_TAX_AMT" : "9.55" }
JoltTransformJSON configuration, Jolt specification:
[ { "operation": "modify-overwrite-beta", "spec": { "H_GROSS_AMNT": "=lastElement(@(1,H_GROSS_AMNT))" } } ]
If H_GROSS_AMNT is not a list of strings but only a comma-separated string, then follow these steps in order. Extract the whole JSON into a single attribute using an ExtractText processor (the attribute will hold the entire JSON content). Next, use EvaluateJsonPath to extract each JSON element into its own attribute. Once you have all four attributes with values, construct the JSON in a ReplaceText processor. H_GROSS_AMNT will still hold the comma-separated string, and you can use ${H_GROSS_AMNT:substringAfterLast(',')} to extract the last element from the string value. Examples of EvaluateJsonPath: https://nifi.apache.org/docs/nifi-docs/html/expression-language-guide.html#jsonpath
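Both branches above reduce to the same operation: keep only the last amount. A small Python sketch of the two cases, just to illustrate what the Jolt `lastElement` and the Expression Language `substringAfterLast(',')` each compute (in the real flow, JoltTransformJSON or ReplaceText does this work, not Python):

```python
import json

def last_gross_amount(record: dict) -> dict:
    """Mirror of the Jolt lastElement / substringAfterLast logic."""
    value = record["H_GROSS_AMNT"]
    if isinstance(value, list):
        # case 1: list of strings -> =lastElement(@(1,H_GROSS_AMNT))
        record["H_GROSS_AMNT"] = value[-1]
    else:
        # case 2: "55.00,58.00" -> ${H_GROSS_AMNT:substringAfterLast(',')}
        record["H_GROSS_AMNT"] = value.rsplit(",", 1)[-1]
    return record

doc = json.loads('{"TOT_NET_AMT": "55.00", "H_GROSS_AMNT": ["55.00", "58.00"]}')
print(last_gross_amount(doc)["H_GROSS_AMNT"])  # 58.00
```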
07-30-2021
01:09 PM
Hi, Once you have extracted the header into a flowfile attribute using the ExtractText processor, you can either convert the header attribute into flowfile content or keep the header value as an attribute. The Stack Overflow post explains extracting the header into a flowfile attribute and then passing the headers as a file to the destination. To convert a flowfile attribute to flowfile content, use a ReplaceText processor, where you can reference flowfile attributes. The success relationship of ReplaceText will contain only the header as flowfile content; the original CSV content will have been replaced with the header. You can then transfer this flowfile to the destination or to the next processor in the flow. Hope this is the information you were looking for. Thanks
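What ExtractText plus ReplaceText accomplish here can be shown in a few lines of Python (a sketch of the idea only, not NiFi itself): capture the first line as the "attribute", then let it replace the whole content.

```python
def header_only(csv_content: str) -> str:
    """Keep only the header row:
    ExtractText captures the first line into an attribute,
    ReplaceText then overwrites the content with that attribute."""
    header = csv_content.splitlines()[0]   # ExtractText: first line -> attribute
    return header                          # ReplaceText: attribute becomes the content

sample = "id,name,amount\n1,alpha,55.00\n2,beta,58.00\n"
print(header_only(sample))  # id,name,amount
```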
07-19-2021
07:59 AM
Hi, If controller services are created at a parent process group level, you will not get those controller services when you create a template for only a child process group. You will either have to create the template at the parent process group of the immediate child process group, or define the DistributedMapCacheServer controller service inside the process group for which you want to create the template. For example, 'Stream Ingestion' is the immediate parent process group of the 'TestSample' process group. If I create a template for the 'TestSample' process group, the template cannot include the DistributedMapCacheServer controller service, because that controller service is scoped to the 'Stream Ingestion' process group. Controller services created within a process group are available to, and can be referenced by, all descendant components. Thanks
07-19-2021
07:22 AM
Hi alexmarco, I did not find an 'Upload/Attach' option to upload the template file. Could you please follow the steps/screenshots mentioned? It should work for your example as well. Thanks
07-16-2021
08:41 AM
Hi alexmarco, If your final JSON format is fixed and the input JSON always arrives in the same format (only the values differ), you can extract the keyword value ("foo" or a new value) into a flowfile attribute and use that attribute in a following ReplaceText processor to pass the value along: ExtractText processor --> UpdateAttribute processor --> ReplaceText processor.
1. Add a new property (keyword_value) in ExtractText with the value/expression below: ("keyword":.*)
2. Remove the space and double quotes from the extracted keyword_value attribute in the UpdateAttribute processor. (You can also apply the UpdateAttribute logic directly in the ReplaceText processor, since ReplaceText supports NiFi Expression Language. This is optional; if you choose it, you can drop the UpdateAttribute processor from this flow.)
3. Insert keyword_value in the ReplaceText processor (keeping the final JSON) as in the sample.
4. Connect the success relationship to the InvokeHTTP processor.
* CreateKeywordvalueAttribute (ExtractText processor) expression: ("keyword":.*)
* UpdateAttribute processor: ${keyword_value:substringAfter(':'):trim():replace('"', '')}
* FinalReplaceText processor: place the JSON below into the Replacement Value property of the processor:
{
"requests": [
{
"source": "blablabla",
"params": {
"keywords": [
"${keyword_value}"
],
"sub-field1": "1"
}
}
],
"field1": "1",
"field2": "2",
"field3": false
}
I have attached the sample tested flow (.xml) for your reference. Please accept the solution if it works as expected. Thanks, Adhi
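Steps 1 to 3 above can be sanity-checked outside NiFi with a short Python emulation. The regex and the cleanup mirror the ExtractText and UpdateAttribute properties quoted above; the function name and the sample input are illustrative only:

```python
import re

# ReplaceText Replacement Value, with %s where ${keyword_value} goes.
TEMPLATE = ('{"requests": [{"source": "blablabla", "params": '
            '{"keywords": ["%s"], "sub-field1": "1"}}], '
            '"field1": "1", "field2": "2", "field3": false}')

def build_request(input_json: str) -> str:
    # ExtractText: property keyword_value with expression ("keyword":.*)
    raw = re.search(r'"keyword":.*', input_json).group(0)
    # UpdateAttribute: ${keyword_value:substringAfter(':'):trim():replace('"', '')}
    keyword = raw.split(":", 1)[1].strip().replace('"', "").rstrip(",} ")
    # ReplaceText: substitute the cleaned value into the fixed final JSON
    return TEMPLATE % keyword

print(build_request('{"keyword": "foo"}'))
```

Running this shows the keyword landing inside the "keywords" array, which is exactly what the three processors do in sequence.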
07-13-2021
07:03 AM
We solved this with the help of the Wait and Notify processors: we route the .hql file to PutHiveQL, which in turn routes success/failure to Notify. The Wait processor holds the .csv file and releases it to be put into HDFS once the signal arrives from the Notify processor.
07-13-2021
06:52 AM
Hi Deepika, I assume you are using the ConsumeKafka or ConsumeKafka_2_0 NiFi processor. When you select StandardSSLContextService for the SSL Context Service property, click the right arrow as indicated in the image below. You will be prompted with the properties of StandardSSLContextService, where you can enter values for Keystore Filename, Keystore Password, Key Password, Truststore Filename, Truststore Type, etc. After providing the values of the SSL properties, enable the controller service and start the ConsumeKafka processor. ConsumeKafka_2_0 processor properties. Please update whether the steps above are what you expected and whether they work. Thanks
07-01-2021
12:45 PM
Hi, The InvokeHTTP processor provides a couple of write attributes with values, for example invokehttp.status.code = 401 and invokehttp.status.message = Unauthorized. When you have these attributes on the Failure or No Retry relationship, you can use a ReplaceText processor, as below, to overwrite the original flowfile content with a new one, so that you can send it in an email. ReplaceText processor properties. If you are looking for more of the response body, please check whether InvokeHTTP can be configured to store it in an attribute, so that you can use it in the ReplaceText processor to overwrite the original flowfile content.
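The ReplaceText step here amounts to formatting the two attributes into a message body. A minimal Python sketch of that formatting (invokehttp.status.code and invokehttp.status.message are the real InvokeHTTP write attributes; the message wording is an assumption):

```python
def error_body(attributes: dict) -> str:
    """Build an email body from InvokeHTTP's write attributes,
    like a ReplaceText Replacement Value referencing ${...} attributes."""
    return ("Request failed with status %s (%s)"
            % (attributes["invokehttp.status.code"],
               attributes["invokehttp.status.message"]))

attrs = {"invokehttp.status.code": "401",
         "invokehttp.status.message": "Unauthorized"}
print(error_body(attrs))  # Request failed with status 401 (Unauthorized)
```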
07-01-2021
04:22 AM
Hi, We are processing a ZIP file that contains multiple timestamped files (.hql, .csv) in a distributed manner. We check whether the file extension is .hql or .csv and route the file to a PutHiveQL or PutHDFS processor respectively. The ZIP file contains the files below (in timestamp order, starting with e.g. t1 or the system timestamp), to be extracted and processed in order:
table_info.zip
table_info_t1.hql
table_info_t1_1.csv
table_info_t1_2.csv
table_info_t2.hql
table_info_t2_1.csv
table_info_t2_2.csv
table_info_latest.hql
table_info_latest.csv
Please find below the NiFi flow and the RouteOnAttribute property. Is there any way to make PutHiveQL execute first and then signal PutHDFS to execute next, for each timestamp one by one, in order? Can we group the files for each timestamp, run the .hql file, and then put the .csv files into HDFS? @Nifi
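One way to reason about the grouping asked for above: derive a group key from each filename and order each group so its .hql comes before its .csv files. The Python sketch below covers only the grouping logic (in NiFi this would map to something like UpdateAttribute deriving the key plus Wait/Notify per group; the regex assumes the naming pattern shown in the listing):

```python
import re
from collections import defaultdict

def group_by_timestamp(filenames):
    """Group table_info_<ts>[_n].<ext> files by their <ts> part,
    with the .hql file first in each group."""
    groups = defaultdict(list)
    for name in filenames:
        m = re.match(r"table_info_(t\d+|latest)", name)
        if m:
            groups[m.group(1)].append(name)
    for key in groups:
        # .hql sorts first, then the .csv files in name order
        groups[key].sort(key=lambda n: (not n.endswith(".hql"), n))
    return dict(groups)

files = ["table_info_t1.hql", "table_info_t1_1.csv", "table_info_t1_2.csv",
         "table_info_t2.hql", "table_info_t2_1.csv",
         "table_info_latest.hql", "table_info_latest.csv"]
print(group_by_timestamp(files)["t1"])
```

Processing each group's first entry through PutHiveQL before releasing the rest to PutHDFS would give the ordering the question asks for.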
Labels:
- Apache NiFi