Member since
06-02-2020
40
Posts
4
Kudos Received
8
Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 5819 | 09-30-2020 09:27 AM |
| | 3701 | 09-29-2020 11:53 AM |
| | 5879 | 09-21-2020 11:34 AM |
| | 5885 | 09-19-2020 09:31 AM |
| | 3645 | 06-28-2020 08:34 AM |
09-30-2020
12:47 PM
Hi @calonsca! Please have a look at this spec as well!

```json
[
  {
    "operation": "shift",
    "spec": {
      "@": "data",
      "ID": "&",
      "#${date}": "date",
      "#${dataset:toLower()}": "dataset"
    }
  }
]
```
09-30-2020
11:25 AM
Hi @DataD, please find the spec below:

```json
[
  {
    "operation": "shift",
    "spec": {
      "rows": {
        "*": {
          "row": {
            "*": {
              "@": "[&3].@(3,header[&1])"
            }
          }
        }
      }
    }
  }
]
```

This will give the output as:

```json
[
  {
    "header1" : "row1",
    "header2" : "row2",
    "header3" : "row3"
  },
  {
    "header1" : "row4",
    "header2" : "row5",
    "header3" : "row6"
  }
]
```

I didn't convert it to

```json
{
  "header1" : "row1",
  "header2" : "row2",
  "header3" : "row3",
  "header1" : "row4",
  "header2" : "row5",
  "header3" : "row6"
}
```

because that is not valid JSON: header1, header2, and header3 would be repeated keys at the same level of the document.
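If it helps to see the same pairing outside of Jolt, here is a minimal Python sketch. It assumes the input shape the spec implies (a top-level `header` array plus a `rows` array of `{"row": [...]}` objects); the function name `pivot_rows` is mine, not part of Jolt or NiFi.

```python
# Illustrative plain-Python equivalent of the Jolt shift above:
# pair each row's values with the top-level headers, producing one
# object per row (so repeated keys are never needed).
def pivot_rows(data):
    headers = data["header"]
    return [dict(zip(headers, row["row"])) for row in data["rows"]]

sample = {
    "header": ["header1", "header2", "header3"],
    "rows": [
        {"row": ["row1", "row2", "row3"]},
        {"row": ["row4", "row5", "row6"]},
    ],
}

print(pivot_rows(sample))
```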
09-30-2020
11:03 AM
Hi @Biswa, please look at the spec below:

```json
[
  {
    "operation": "shift",
    "spec": {
      "*": {
        "urlTypeName": {
          "Spring URL": {
            "@(2,examUrl)": "ExamDashBoardURL[]"
          }
        }
      }
    }
  }
]
```

The output will be:

```json
{
  "ExamDashBoardURL" : [ "https://exam.test.com/page/1473161074" ]
}
```

Let me know if this works for you.
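For clarity, the spec effectively filters on `urlTypeName` and collects the matching `examUrl` values. A rough Python sketch of that logic (the helper name and the second sample entry are hypothetical, added only to show a non-matching case):

```python
# Illustrative sketch of what the Jolt shift above does: keep examUrl
# only from entries whose urlTypeName is exactly "Spring URL".
def exam_dashboard_urls(entries):
    return {
        "ExamDashBoardURL": [
            e["examUrl"] for e in entries
            if e.get("urlTypeName") == "Spring URL"
        ]
    }

sample = [
    {"urlTypeName": "Spring URL",
     "examUrl": "https://exam.test.com/page/1473161074"},
    # hypothetical non-matching entry, ignored by the filter
    {"urlTypeName": "Other URL",
     "examUrl": "https://example.org/ignored"},
]

print(exam_dashboard_urls(sample))
```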
09-30-2020
10:53 AM
Hi @Ayaz, @mburgess! Please have a look at this spec as well!

```json
[
  {
    "operation": "shift",
    "spec": {
      "*": {
        "BRANCH_CODE": "[&1].Fields.FLD0001",
        "CUST_NO": "[&1].Fields.FLD0002",
        "AC_DESC": "[&1].Fields.FLD0003",
        "CUST_AC_NO": "[&1].ExternalSystemIdentifier",
        "#1": "[&1].InstitutionId"
      }
    }
  }
]
```

Just FYI!
09-30-2020
09:27 AM
@Sru111, if possible, please update your NiFi version to 1.11.4 or above; you will find the load-balancing option there. Otherwise, stick to your plan of using the primary node only for the FetchSFTP processor. You could still achieve this in your current NiFi using Remote Process Groups, but it becomes really complex that way.
09-30-2020
08:06 AM
Hi @Sru111, setting the FetchSFTP processor to run on the primary node is fine. But if there are multiple files to fetch from SFTP with the same processor, the second file will be fetched only after the first one completes (and similarly for the rest). Across multiple nodes, however, you can fetch them simultaneously, so using all three nodes is preferred for the FetchSFTP processor. May I know which version of NiFi you are using? I believe the load-balance strategy was introduced in 1.11.0 (not sure), but it started working correctly in 1.11.4. Regarding PutHDFS, I don't have a clue about it, sorry!
09-29-2020
12:06 PM
Hi @Sru111, consider the MergeContent processor in the picture as your MergeContent processor. Configure the queue (here, 'success') that acts as the upstream queue for your MergeContent processor: select the load-balance strategy 'Single node', and all the files will arrive as input on only one of the nodes. (Optional) You can configure the downstream queue of the MergeContent processor with the load-balance strategy 'Round robin', so that the files are distributed among all the nodes in the cluster.
09-29-2020
11:53 AM
Hi @Sru111, for the FetchSFTP processor, change the Bulletin Level to NONE. Then that error won't appear. Note that bulletins for other errors, such as authentication failures or communication timeouts, will be suppressed as well, but the flowfiles will still be routed to the respective relationships.
09-27-2020
08:08 AM
Hi @wcbdata, can you explain the usage of '#' in the spec you used above?

```json
{
  "operation": "shift",
  "spec": {
    "tmp": {
      "*": {
        "0": {
          "*,*,*,*": {
            "@(4,runid)": "particles.[#4].runid",
            "@(4,ts)": "particles.[#4].ts",
            "$(0,1)": "particles.[#4].Xloc",
            "$(0,2)": "particles.[#4].Yloc",
            "$(0,3)": "particles.[#4].Xdim",
            "$(0,4)": "particles.[#4].Ydim"
          }
        }
      }
    },
    "*": "&"
  }
}
```
09-21-2020
11:34 AM
Hi @DarkStar, your Expression Language (EL) should be:

```
${field.value:substring(0,28):toDate("MMM dd,yyyy HH:mm:ss.SSSSSSSSS"):format("yyyy-MM-dd HH:mm:ss.SSSSSS")}
```

In your EL, the pattern you used has six "S" characters, but the input has precision up to nine digits. Since you gave only six, it mis-parses the nine-digit fraction (388267000 in your example). I think that must be the reason.
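To illustrate why the fraction length matters, here is a small Python sketch of the same truncate-then-reformat idea. The sample timestamp is hypothetical (Python's `%f` directive is capped at six digits, so the nine-digit fraction is cut to six before parsing, much like the `substring` step in the EL):

```python
from datetime import datetime

# Hypothetical input with nine fractional digits, similar in shape to
# the value discussed above.
raw = "Sep 21,2020 11:34:00.123456789"

# Split off the fraction and keep only the first six digits, since
# %f parses at most microsecond precision.
head, frac = raw.split(".")
parsed = datetime.strptime(f"{head}.{frac[:6]}", "%b %d,%Y %H:%M:%S.%f")

print(parsed.strftime("%Y-%m-%d %H:%M:%S.%f"))  # 2020-09-21 11:34:00.123456
```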