Member since: 06-02-2020
Posts: 40
Kudos Received: 4
Solutions: 8
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 4011 | 09-30-2020 09:27 AM
 | 2481 | 09-29-2020 11:53 AM
 | 3938 | 09-21-2020 11:34 AM
 | 4216 | 09-19-2020 09:31 AM
 | 2525 | 06-28-2020 08:34 AM
10-02-2020
10:33 AM
@justenji If you consider a few use cases:

1) Take the input `123;Valid;567;45`. The first replace ( replaceAll('Valid',' ') ) gives `123; ;567;45`, and applying the next replace ( replaceAll(' ',';') ) to that gives `123;;;567;45`. This is not the desired output.

2) Take another input, `Valid;123;567;Valid;Valid;45`. The first replace gives ` ;123;567; ; ;45`, and the second replace gives `;;123;567;;;;;45`. Again, this is not the desired output.

To remove all this confusion, what I wanted to do was separate all the values other than "Valid" with spaces. So I replace both "Valid" and ";" with a space; the regex Valid|; matches either one. There might still be spaces at the beginning or at the end (e.g. ` 123 567 45` or `123 567 45 `), so trim() removes the outer spaces. You are then left with the values separated by runs of one or more spaces, and the second replace ( replaceAll('\s+',';') ) inserts exactly one semicolon regardless of how many spaces sit in between. The output is `123;567;45`.
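A minimal Groovy sketch of the same replace-trim-replace chain (the `clean` closure and the sample strings are just for illustration):

```groovy
// 1) turn every "Valid" and ";" into a space,
// 2) trim leading/trailing spaces,
// 3) collapse each run of spaces into a single ";"
def clean = { String s ->
    s.replaceAll('Valid|;', ' ').trim().replaceAll('\\s+', ';')
}

assert clean('123;Valid;567;45') == '123;567;45'
assert clean('Valid;123;567;Valid;Valid;45') == '123;567;45'
```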
10-01-2020
03:31 AM
Hi @Karthik_Sise, are you using a DistributedMapCacheServer (DMCS) controller service along with the DistributedMapCacheClientService (DMCCS)? If not, please add the DMCS too. Note that the port configured in the DMCS must be the same port used by the DMCCS. It doesn't matter where you add it, as long as both the DMCS and the DMCCS use the same port.
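For reference, a minimal sketch of the pairing; the property names below are the ones on NiFi's standard DMCS/DMCCS services, but the host and port values here are placeholders:

```
DistributedMapCacheServer (controller service)
    Port            : 4557          # any free port

DistributedMapCacheClientService (controller service)
    Server Hostname : localhost     # host where the DMCS is running
    Server Port     : 4557          # must match the DMCS Port above
```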
09-30-2020
12:47 PM
Hi @calonsca! Please have a look at this spec as well!

```json
[
  {
    "operation": "shift",
    "spec": {
      "@": "data",
      "ID": "&",
      "#${date}": "date",
      "#${dataset:toLower()}": "dataset"
    }
  }
]
```
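To illustrate (the input and attribute values here are made up, not from the original question): with flowfile attributes date = 2020-09-30 and dataset = ABC, an input like `{"ID": 1, "name": "test"}` would come out as:

```json
{
  "data" : { "ID" : 1, "name" : "test" },
  "ID" : 1,
  "date" : "2020-09-30",
  "dataset" : "abc"
}
```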
09-30-2020
12:32 PM
Hi @justenji! Please take a look at the code below and tell me if it works or if you need any further changes. As of now, I have converted the timestamps and added dnr_group.

```groovy
import java.nio.charset.StandardCharsets
import org.apache.nifi.processor.io.OutputStreamCallback
import groovy.json.JsonSlurper
import groovy.json.JsonOutput

flowFile = session.get()
if (!flowFile) return
try {
    def jsonSlurper = new JsonSlurper()
    def jsonOutput = new JsonOutput()
    def input = flowFile.read().withStream { data -> jsonSlurper.parse(data) }

    def pattern1 = 'yyyyMMddHHmmss'
    def tz1 = 'GMT+0200'
    def pattern2 = 'yyyy-MM-dd HH:mm:ss'
    def tz2 = 'GMT'

    input.stand = convertDatePattern(input.stand, pattern1, TimeZone.getTimeZone(tz1), pattern2, TimeZone.getTimeZone(tz2))

    for (int i = 0; i < input.table.size(); i++) {
        input.table[i].elem_stand = convertDatePattern(input.table[i].elem_stand, pattern1, TimeZone.getTimeZone(tz1), pattern2, TimeZone.getTimeZone(tz2))
        // strip the surrounding parentheses and left-pad single digits with a zero
        def dnr = input.table[i].dnr.replaceAll('\\(|\\)', '')
        def group = input.table[i].group.replaceAll('\\(|\\)', '')
        if (dnr.toInteger() < 10) { dnr = '0' + dnr }
        if (group.toInteger() < 10) { group = '0' + group }
        input.table[i].dnr_group = 'V-' + dnr + '-' + group
        input.table[i].remove('dnr')
        input.table[i].remove('group')
    }

    flowFile = session.write(flowFile, { outputStream ->
        outputStream.write(jsonOutput.toJson(input).toString().getBytes(StandardCharsets.UTF_8))
    } as OutputStreamCallback)
    session.transfer(flowFile, REL_SUCCESS)
} catch (e) {
    log.error('Error occurred, {}', e)
    session.transfer(flowFile, REL_FAILURE)
}

// parse with the source pattern/zone, format with the target pattern/zone
def convertDatePattern(String input, String pattern1, TimeZone tz1, String pattern2, TimeZone tz2) {
    return new Date().parse(pattern1, input, tz1).format(pattern2, tz2).toString()
}
```
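For example (sample values made up; this assumes dnr and group arrive as numbers wrapped in parentheses, matching the replaceAll above), an input like

```json
{
  "stand" : "20200930123000",
  "table" : [
    { "elem_stand" : "20200930123000", "dnr" : "(7)", "group" : "(12)" }
  ]
}
```

would come out as

```json
{
  "stand" : "2020-09-30 10:30:00",
  "table" : [
    { "elem_stand" : "2020-09-30 10:30:00", "dnr_group" : "V-07-12" }
  ]
}
```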
09-30-2020
11:25 AM
Hi @DataD, Please find the below spec:

```json
[
  {
    "operation": "shift",
    "spec": {
      "rows": {
        "*": {
          "row": {
            "*": {
              "@": "[&3].@(3,header[&1])"
            }
          }
        }
      }
    }
  }
]
```

This will give the output as:

```json
[ {
  "header1" : "row1",
  "header2" : "row2",
  "header3" : "row3"
}, {
  "header1" : "row4",
  "header2" : "row5",
  "header3" : "row6"
} ]
```

I didn't merge everything into a single object like

```json
{
  "header1" : "row1",
  "header2" : "row2",
  "header3" : "row3",
  "header1" : "row4",
  "header2" : "row5",
  "header3" : "row6"
}
```

because that is not valid JSON: header1, header2 and header3 would be repeated keys at the same level.
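For reference, the spec assumes an input shaped roughly like the one below (reconstructed from the spec; your values may differ). The @(3,header[&1]) on the right-hand side picks the output key from the header array at the same index as the row value:

```json
{
  "rows" : [
    { "header" : [ "header1", "header2", "header3" ], "row" : [ "row1", "row2", "row3" ] },
    { "header" : [ "header1", "header2", "header3" ], "row" : [ "row4", "row5", "row6" ] }
  ]
}
```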
09-30-2020
11:03 AM
Hi @Biswa, Please look at the below spec:

```json
[
  {
    "operation": "shift",
    "spec": {
      "*": {
        "urlTypeName": {
          "Spring URL": {
            "@(2,examUrl)": "ExamDashBoardURL[]"
          }
        }
      }
    }
  }
]
```

Output will be:

```json
{
  "ExamDashBoardURL" : [ "https://exam.test.com/page/1473161074" ]
}
```

Tell me if this is ok.
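For reference, the spec assumes an input shaped roughly like this (reconstructed from the spec; the second entry is made up to show that only "Spring URL" entries get collected):

```json
[
  { "urlTypeName" : "Spring URL", "examUrl" : "https://exam.test.com/page/1473161074" },
  { "urlTypeName" : "Other URL", "examUrl" : "https://exam.test.com/page/999" }
]
```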
09-30-2020
10:53 AM
Hi @Ayaz, @mburgess! Please have a look at this spec as well!

```json
[
  {
    "operation": "shift",
    "spec": {
      "*": {
        "BRANCH_CODE": "[&1].Fields.FLD0001",
        "CUST_NO": "[&1].Fields.FLD0002",
        "AC_DESC": "[&1].Fields.FLD0003",
        "CUST_AC_NO": "[&1].ExternalSystemIdentifier",
        "#1": "[&1].InstitutionId"
      }
    }
  }
]
```

Just FYI!
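To illustrate (field values made up), an input like

```json
[
  { "BRANCH_CODE" : "001", "CUST_NO" : "12345", "AC_DESC" : "Savings", "CUST_AC_NO" : "0011234501" }
]
```

maps to

```json
[
  {
    "Fields" : {
      "FLD0001" : "001",
      "FLD0002" : "12345",
      "FLD0003" : "Savings"
    },
    "ExternalSystemIdentifier" : "0011234501",
    "InstitutionId" : "1"
  }
]
```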
09-30-2020
10:34 AM
1 Kudo
Hi @Nidutt! Use the below Expression Language:

```
${literal(${allMatchingAttributes("error_field.*"):join(";")}):replaceAll('Valid|;',' '):trim():replaceAll('\s+',';')}
```
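Step by step (attribute names and values made up for illustration): suppose the flowfile has error_field.1 = Valid, error_field.2 = 123 and error_field.3 = 567. Then:

```
join(";")                  ->  "Valid;123;567"
replaceAll('Valid|;',' ')  ->  "  123 567"
trim()                     ->  "123 567"
replaceAll('\s+',';')      ->  "123;567"
```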
09-30-2020
09:27 AM
@Sru111, if possible, please upgrade your NiFi version to 1.11.4 or above; you will find the load balancing option there. Otherwise, stick to your plan of using the primary node only for the FetchSFTP processor. You could still do it in your current NiFi using Remote Process Groups, but it becomes really complex that way.
09-30-2020
08:06 AM
Hi @Sru111, setting the FetchSFTP processor to run on the primary node is fine. But if there are multiple files to fetch from SFTP with the same processor, the second file is fetched only after the first one finishes (and similarly for the rest), whereas with load balancing you could fetch them simultaneously. So using all 3 nodes is preferred for the FetchSFTP processor. May I know which version of NiFi you are using? I believe the load balance strategy was introduced in 1.11.0 (not sure), but it started working correctly in the 1.11.4 version. Regarding PutHDFS, I don't have a clue about it, sorry!