Member since: 07-08-2016
Posts: 260
Kudos Received: 43
Solutions: 9
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 576 | 05-02-2018 06:03 PM |
| | 944 | 10-18-2017 04:02 PM |
| | 331 | 08-25-2017 08:59 PM |
| | 490 | 07-21-2017 08:13 PM |
| | 4160 | 04-06-2017 09:54 PM |
10-18-2018
05:46 AM
Hi, we have noticed that some of our nifi.err.log and nifi.wrapper.log files are growing to multiple GBs, and I don't see any documentation on how to clear or roll those. Also, why do they keep growing with messages like the ones below?

```
No Instance(s) Available.
No Instance(s) Available.
No Instance(s) Available.
Node - USDNYL4071
ERROR: Description = Invalid query
ERROR: The search filter cannot be recognized.
ERROR: The search filter cannot be recognized.
ERROR: The search filter cannot be recognized.
ERROR: The search filter cannot be recognized.
ERROR: The search filter cannot be recognized.
ERROR: The search filter cannot be recognized.
```
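For anyone hitting the same growth: these two files appear to be written by the service wrapper rather than by logback (conf/logback.xml only configures files like nifi-app.log, nifi-user.log and nifi-bootstrap.log), so the logback rolling policies don't apply to them. A manual archive-and-truncate is one option; a minimal sketch, assuming NiFi can be stopped briefly (Windows locks files the service holds open) and with placeholder paths:

```python
# Hypothetical archive-and-truncate for the wrapper logs; assumes NiFi is
# stopped first, since Windows locks files that are open for writing.
import gzip
import shutil
from datetime import datetime
from pathlib import Path

LOG_DIR = Path(r"C:\nifi\logs")        # placeholder: adjust to your install
MAX_BYTES = 500 * 1024 * 1024          # archive anything over ~500 MB

for name in ("nifi.err.log", "nifi.wrapper.log"):
    log = LOG_DIR / name
    if log.exists() and log.stat().st_size > MAX_BYTES:
        stamp = datetime.now().strftime("%Y%m%d%H%M%S")
        archive = log.parent / f"{name}.{stamp}.gz"
        with open(log, "rb") as src, gzip.open(archive, "wb") as dst:
            shutil.copyfileobj(src, dst)   # compress the old contents
        log.write_text("")                 # truncate the live file
```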
08-02-2018
04:41 PM
Hi @Nikita Buxy, were you able to solve this? If so, how? I got an .asc file and a passphrase from our vendor, and I am trying to use the EncryptContent processor to decrypt the files. I converted the .asc to .gpg using this command:

```
gpg --dearmor C:\SaiDEV\Backup.asc
```

It created a Backup.asc.gpg file, and I am pointing to that in the Private Keyring File property. EncryptContent is throwing the same error you pointed out above:

```
11:34:32 CDT ERROR fb10a940-0164-1000-a27b-69c298405157
EncryptContent[id=fb10a940-0164-1000-a27b-69c298405157] Cannot decrypt StandardFlowFileRecord[uuid=11dcb47f-d30d-43b8-82d3-c80f7523d8ec,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1533227197998-43, container=default, section=43], offset=385301, length=128433],offset=0,name=Test.txt,size=128433] - : org.apache.nifi.processor.exception.ProcessException: Exception creating cipher
```

Hi @Andy LoPresto, any help here? Regards, Sai
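Outside of NiFi, a quick way to check whether the vendor key and passphrase actually work together is python-gnupg; a minimal sketch, where all paths and the passphrase are placeholders:

```python
# Minimal sketch using python-gnupg (pip install python-gnupg) to verify the
# vendor key and passphrase outside NiFi; paths are hypothetical placeholders.
import gnupg

gpg = gnupg.GPG(gnupghome=r"C:\SaiDEV\gnupg")   # assumption: a scratch keyring dir

# Import the ASCII-armored private key exactly as the vendor sent it
# (including the BEGIN/END PGP PRIVATE KEY BLOCK lines).
with open(r"C:\SaiDEV\Backup.asc") as key_file:
    result = gpg.import_keys(key_file.read())
print("imported keys:", result.count)

# Try to decrypt a sample file with the vendor's passphrase.
with open(r"C:\SaiDEV\Test.txt.pgp", "rb") as enc:
    decrypted = gpg.decrypt_file(enc, passphrase="vendor-passphrase")
print("ok:", decrypted.ok, "status:", decrypted.status)
```

If this decrypts cleanly, the key material is fine and the problem is how the keyring file is being built for EncryptContent.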
08-01-2018
04:46 PM
Hi, just trying to get some attention from the experts. Any idea how to decrypt a PGP file using a passphrase and secret key (private key)? Regards, Sai
07-30-2018
06:11 PM
Hi @Matt Burgess, is there any way I can use ValidateRecord to just check whether a file follows a schema and then route it to valid? I don't know why we need a Record Writer for validation; it changes the file format a little (moving the order of tags, etc.). I just want the input file as output, unchanged, if it is valid. Or is there any other way I can achieve this? Regards, Sai
07-25-2018
04:11 PM
@Matt Burgess, thank you. I didn't know about Validate Field Names.
07-24-2018
09:05 PM
Hi @Matt Burgess, any idea what I am doing wrong in the above case?
07-23-2018
09:26 PM
1 Kudo
Hi, I need to ingest only the JSON files that follow a valid schema, and I am trying to achieve this with the ValidateRecord processor. I am supplying the same schema to both the JsonTreeReader and the JsonRecordSetWriter. I am not running InferAvroSchema on the raw input because my input contains "-" in the names, which Avro field names don't allow (I came up with this schema by modifying the input file to remove the "-", running InferAvroSchema, and then changing the names to use "_" to match the input file). My schema and files are matching, but the processor sends everything to the invalid relationship. Is there anything wrong with what I am doing? Schema:

```json
{
  "type": "record",
  "name": "iHist",
  "fields": [
    { "name": "file_name", "type": "string" },
    { "name": "plant", "type": "string" },
    { "name": "collector", "type": "string" },
    { "name": "name", "type": "string" },
    { "name": "unique_id", "type": "string" },
    { "name": "description", "type": "string" },
    { "name": "general_1", "type": "string" },
    { "name": "general_2", "type": "string" },
    { "name": "general_3", "type": "string" },
    { "name": "general_4", "type": "string" },
    { "name": "general_5", "type": "string" },
    {
      "name": "data_points",
      "type": {
        "type": "array",
        "items": {
          "type": "record",
          "name": "data_points",
          "fields": [
            { "name": "timestamp", "type": "string" },
            { "name": "value", "type": "string" },
            { "name": "quality", "type": "string" }
          ]
        }
      }
    }
  ]
}
```

Data file:

```json
{
  "file-name": "tp-tcollec.tag.json",
  "plant": "P11A3",
  "collector": "test_Collector",
  "name": "tag_SAFETY_MARGN.F_CV",
  "unique-id": "1532358720761",
  "description": "test",
  "general-1": "",
  "general-2": "",
  "general-3": "",
  "general-4": "",
  "general-5": "",
  "datapoints": [
    { "timestamp": "2016-07-19T10:25:43.000Z", "value": "177", "quality": "100" },
    { "timestamp": "2016-07-19T10:25:42.000Z", "value": "177", "quality": "100" },
    { "timestamp": "2016-07-19T10:25:41.000Z", "value": "177", "quality": "100" }
  ]
}
```

I just need to validate that the input file follows the schema. Any better ways to do this?
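Note that the data file uses hyphenated names (file-name, unique-id, datapoints) while the schema declares underscored ones (file_name, unique_id, data_points), which is likely why everything routes to invalid. The same check can be reproduced outside NiFi with fastavro's validator; a minimal sketch, assuming the schema and record above are saved locally as schema.avsc and input.json:

```python
# Minimal sketch using fastavro (pip install fastavro) to validate the record
# against the Avro schema outside NiFi; file names are placeholders.
import json
from fastavro.validation import validate

with open("schema.avsc") as f:     # the schema shown above
    schema = json.load(f)
with open("input.json") as f:      # the data file shown above
    record = json.load(f)

try:
    validate(record, schema)       # raises ValidationError on mismatch
    print("valid")
except Exception as err:
    # With the files exactly as shown this fails, because the record has
    # "file-name" etc. while the schema expects "file_name".
    print("invalid:", err)
```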
07-20-2018
07:58 PM
Hi experts, any help here? How do we decrypt a PGP-encrypted file? I have a passphrase and a private key; are those enough? Regards, Sai
07-20-2018
03:45 PM
@ashok.kumar, I already tried copying the private key to a file and pointing to that, but I see the same error; the processor still shows the same warning.
07-20-2018
02:18 PM
@ashok.kumar, the private key and the passphrase were given to me by the third-party vendor. Can I use them directly in the processor as in the screenshot, or should I use some commands to generate a private keyring file? If so, how do I do that? Regards, Sai
07-19-2018
06:32 PM
Hi, we downloaded a PGP-encrypted file from our vendor's AWS S3, and I am trying to decrypt it using the EncryptContent processor. I set the private key and passphrase provided by them, but the processor shows a warning at the top left saying the private keyring file is invalid and could not be opened with the password phrase provided. Am I doing something wrong? Should I not include -----BEGIN PGP PRIVATE KEY BLOCK----- and -----END PGP PRIVATE KEY BLOCK-----? I tried removing them and the same thing happens. Screenshots attached. Regards, Sai
07-17-2018
06:09 PM
Any input from the experts?
07-16-2018
08:47 PM
Hi, our OU in the LDAP connection has changed, and we are getting invalid ID/password errors when trying to log in to NiFi; I think this is expected. But if I change the old ou=Users and Groups,ou=EU,ou=same,ou=Organizations,dc=comp,dc=com to the new ou=Users and Groups,ou=USA,ou=same,ou=Organizations,dc=comp,dc=com, do I need to restart NiFi? Is there a way this could be avoided? That is, instead of giving the whole string, can I just give the top-level OU, like ou=Users and Groups? Regards, Sai
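Before editing conf/login-identity-providers.xml, it can help to confirm the new base DN actually resolves; a minimal sketch with the ldap3 library, where the host, bind account, and sample user are all placeholders:

```python
# Minimal sketch using ldap3 (pip install ldap3) to confirm the new user OU
# resolves before updating NiFi's login-identity-providers.xml.
# Host, bind user, and password are hypothetical placeholders.
from ldap3 import Server, Connection, SUBTREE

server = Server("ldap://ldap.comp.com:389")
conn = Connection(server,
                  user="cn=svc-nifi,ou=Service Accounts,dc=comp,dc=com",
                  password="secret",
                  auto_bind=True)

# Search under the new OU for a known account.
conn.search(search_base="ou=Users and Groups,ou=USA,ou=same,ou=Organizations,dc=comp,dc=com",
            search_filter="(sAMAccountName=someuser)",
            search_scope=SUBTREE,
            attributes=["cn"])
print(conn.entries)
```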
07-03-2018
09:21 PM
@Matt Burgess, hmm, why wouldn't it work for me? I am running on Windows, but that shouldn't stop it, right? Regards, Sai
07-02-2018
08:11 PM
@Matt Burgess, is this what you are referring to? This is giving filename2 as an empty string.
07-02-2018
07:38 PM
Hi, I have a JSON file like the one below. Currently I am using EvaluateJsonPath to get the filename and unique-id, and UpdateAttribute to concatenate those two into one filename using ${filename:append('_'):append(${unique-id})}. Can I do that in EvaluateJsonPath only? Basically, I want to avoid the second processor if it can be done in one. Regards, Sai
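For reference, the logic the two processors perform is just two lookups and a join; a minimal sketch, assuming top-level file-name and unique-id fields as in the sample files from the earlier posts:

```python
# Minimal sketch of the extract-and-concatenate logic the two processors
# perform; field names follow the sample files in the earlier posts.
import json

with open("input.json") as f:        # placeholder path
    doc = json.load(f)

# EvaluateJsonPath step: pull the two values out of the document.
file_name = doc["file-name"]
unique_id = doc["unique-id"]

# UpdateAttribute step: ${filename:append('_'):append(${unique-id})}
new_filename = f"{file_name}_{unique_id}"
print(new_filename)                  # e.g. tp-tcollec.tag.json_1532358720761
```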
07-02-2018
07:25 PM
Hi, is there any way we can run a Remote Process Group with multithreading (concurrent tasks) in NiFi Site-to-Site? Regards, Sai
06-19-2018
05:04 PM
1 Kudo
@Matt Burgess, if we follow your approach, can we then use INSERT INTO ... SELECT to load from the Avro table into the ORC table? The reason being that some features, like ACID in Hive, only work with the ORC format.
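For later readers, that load is a single statement; a sketch issuing it through PyHive, where the host and the table names (avro_table, orc_table) are placeholders and orc_table is assumed to already exist with matching columns:

```python
# Sketch: load an Avro-backed Hive table into an ORC-backed one via
# INSERT INTO ... SELECT, using PyHive (pip install pyhive).
from pyhive import hive

conn = hive.connect(host="hive-server.example.com", port=10000)
cur = conn.cursor()

# Assumes orc_table already exists (CREATE TABLE ... STORED AS ORC) with
# columns matching avro_table.
cur.execute("INSERT INTO TABLE orc_table SELECT * FROM avro_table")
```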
06-11-2018
09:04 PM
@Matt Burgess, I think I will try to do this as suggested by Bryan somewhere else: you can use MergeContent, set the Delimiter Strategy to "Text", and then enter [ , ] for the header, demarcator, and footer respectively. I think it should work.
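That configuration works because wrapping the fragments in "[" and "]" and joining them with "," is exactly how a JSON array is built; a toy illustration:

```python
# Toy illustration of what MergeContent does with header "[", demarcator ","
# and footer "]": joining N JSON objects into one valid JSON array.
import json

fragments = ['{"a": 1}', '{"a": 2}', '{"a": 3}']   # stand-ins for flowfile contents

merged = "[" + ",".join(fragments) + "]"
print(json.loads(merged))   # parses cleanly: [{'a': 1}, {'a': 2}, {'a': 3}]
```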
06-11-2018
08:55 PM
Hi @Matt Burgess, I have another challenge: I need to merge these smaller JSON files into one file. Let's say I need to merge 1000 files that I get from the JOLT above into one single JSON file and load it into HDFS. How can I do that? The input files will just be concatenated one after another by the MergeContent processor, and the resulting file won't be valid JSON. Any idea how to solve this? Regards, Sai
06-06-2018
04:45 PM
Hi, I need to transform the input JSON below:

```json
{
  "TransactionId": "-1",
  "Source": "ihrs",
  "Name": "tag-data",
  "Id": "126",
  "Records": [
    {
      "Name": "tag-master",
      "PayLoad": {
        "file-name": "testfile.json",
        "plant": "1086",
        "collector": "testcollector",
        "tag-name": "testtag",
        "tag-description": "test desc"
      }
    },
    {
      "Name": "tag-detail",
      "PayLoad": {
        "tag-value": "98998.55",
        "tag-timestamp": "2018-01-02T16:09:39.000000+00:00",
        "tag-quality": "100"
      }
    },
    {
      "Name": "tag-detail",
      "PayLoad": {
        "tag-value": "91009.47",
        "tag-timestamp": "2018-01-02T16:09:34.000000+00:00",
        "tag-quality": "100"
      }
    },
    {
      "Name": "tag-detail",
      "PayLoad": {
        "tag-value": "91021.80",
        "tag-timestamp": "2018-01-02T16:09:41.000000+00:00",
        "tag-quality": "100"
      }
    }
  ]
}
```

into output like the below:

```json
{
  "TransactionId": "-1",
  "Source": "ihrs",
  "Name": "tag-data",
  "Id": "100",
  "Records": [
    {
      "Name": "tag-master",
      "PayLoad": {
        "file-name": "testfile.json",
        "plant": "1000",
        "collector": "testcollector",
        "tag-name": "testtag",
        "tag-description": "test desc",
        "Children": [
          {
            "Name": "tag-detail",
            "PayLoad": {
              "tag-value": "98998.55",
              "tag-timestamp": "2018-01-02T16:09:39.000000+00:00",
              "tag-quality": "100"
            }
          },
          {
            "Name": "tag-detail",
            "PayLoad": {
              "tag-value": "98998.55",
              "tag-timestamp": "2018-01-02T16:09:39.000000+00:00",
              "tag-quality": "100"
            }
          }
        ]
      }
    }
  ]
}
```

I need to move the detail records under the master as an array; basically, the first element of the Records array (tag-master) is the parent of the next three elements (tag-detail). I tried to use the ConvertRecord processor with JSON readers and writers, giving correct input and output schemas for the files, but it didn't convert. It looks like I may have to write a JOLT spec, which I need some help with. Thanks in advance! Attached files for reference: input.json, output.json. Regards,
Sai
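In case a plain-code restatement helps whoever writes the JOLT spec, the reshaping is: take the first record as the parent and fold the remaining records into a Children array inside its PayLoad. A minimal sketch of that logic outside NiFi, e.g. for testing:

```python
# Minimal sketch of the reshaping the JOLT spec needs to express: the first
# element of Records (tag-master) becomes the parent, and the remaining
# tag-detail records move under its PayLoad as a "Children" array.
import json

with open("input.json") as f:        # the input shown above
    doc = json.load(f)

master, *details = doc["Records"]
master["PayLoad"]["Children"] = details
doc["Records"] = [master]

print(json.dumps(doc, indent=2))     # produces the master-with-Children shape
```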
05-29-2018
08:19 PM
Hi, on my Windows NiFi server I am able to send unauthenticated emails using the PutEmail processor by supplying the SMTP host name, port 25, and user credentials, but the email subject starts with UNVERIFIED SENDER. When I tried port 587, which we use for authenticated email, I got the error below. I am able to send emails on port 587 using a PowerShell command, so I know the user has permission to send on that port. Any ideas on how to solve this?

```
15:10:47 CDT ERROR 015e112b-6c5c-180e-115e-cece012de960
PutEmail[id=015e112b-6c5c-180e-115e-cece012de960] Failed to send email for StandardFlowFileRecord[uuid=741a83b9-ee95-4bef-aea4-93ae30d28c17,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1527624082884-412, container=default, section=412], offset=361603, length=4],offset=0,name=1232268045691876,size=4]: 535 5.7.3 Authentication unsuccessful; routing to failure: javax.mail.AuthenticationFailedException: 535 5.7.3 Authentication unsuccessful
```
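To isolate whether the problem is the credentials or the PutEmail configuration, the same authenticated submission can be tried from Python's standard library; a minimal sketch, where the host and credentials are placeholders:

```python
# Minimal sketch using the standard library to test authenticated SMTP
# submission on port 587; host and credentials are placeholders.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "sender@comp.com"
msg["To"] = "recipient@comp.com"
msg["Subject"] = "SMTP 587 test"
msg.set_content("test")

with smtplib.SMTP("smtp.comp.com", 587) as smtp:
    smtp.starttls()                    # port 587 normally requires STARTTLS
    smtp.login("sender@comp.com", "password")   # a 535 here mirrors the NiFi error
    smtp.send_message(msg)
```

If this succeeds with the same credentials, the difference is usually that the server requires STARTTLS before login on 587, so compare PutEmail's SMTP TLS and SMTP Auth settings.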
05-29-2018
04:18 PM
@Matt Burgess But ListHDFS keeps state and is only supposed to pull changed files, right? @Paul Hernandez, what are the properties of your ListHDFS?
05-18-2018
08:54 PM
@Shu, I thought about this, but the only issue is that my files are huge, and splitting them by lines may not be ideal.
05-18-2018
07:36 PM
Hi, I am trying to update the header of my CSV file with a regular expression to remove special characters from the header line only. How do I do that? I tried reading the file and, on one route, using RouteText, ReplaceText, and ExtractText to get the first line and store it in a headerline attribute, and on another route moving the file without the header, then merging the two with MergeContent using the headerline from route 1. But the output only contains the header when the first file to arrive at MergeContent comes from route 1, since that flowfile has the headerline property; if the first file comes from route 2, the output file has no header, because that flowfile doesn't have the property. Any idea how to solve this?
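For reference, the header-only cleanup itself is one regex applied to just the first line; a minimal sketch of the logic (e.g. to adapt into a scripted processor), streaming so it also handles the huge files mentioned above, with placeholder paths:

```python
# Minimal sketch: strip special characters from the CSV header line only,
# leaving every data line untouched. Paths are placeholders.
import re

with open("input.csv", encoding="utf-8") as src, \
     open("output.csv", "w", encoding="utf-8") as dst:
    header = src.readline()
    # Keep letters, digits, underscores, the comma delimiter and the newline.
    dst.write(re.sub(r"[^A-Za-z0-9_,\n]", "", header))
    for line in src:          # stream the rest untouched (works for huge files)
        dst.write(line)
```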
05-17-2018
06:32 PM
2 Kudos
@Chandan singh It looks like your PutJMS is still running; you won't be able to clear the queue until it finishes. If your process is hung, you have to restart NiFi.
05-02-2018
06:03 PM
@vinayak krishnan, if you are using the UnpackContent processor, the fragment.count attribute should give you the count. Regards, Sai
04-16-2018
01:29 PM
@jwitt, @Shu, thanks for your input. I am actually following a similar approach, but I was trying to see if there is anything I can do at the file level instead of splitting it. Regards, Sai
04-13-2018
03:56 PM
Hi, I have a JSON file coming in from Google BigQuery in the format below (it looks like Google BQ only supports extracting JSON as newline-delimited):

```json
{"person": "person1 value"}
{"person": "person2 value"}
```

So I am not able to use that file as a regular JSON file, and none of the JSON processors (EvaluateJsonPath, SplitJson, etc.) work on it. How can I change it to a valid, readable file like the one below and use it with the other processors?

```json
[{"person": "person1 value"},
 {"person": "person2 value"}]
```

I do not have a schema for the incoming JSON file. Thanks, Sai
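The conversion itself is mechanical: parse each non-empty line as its own JSON object and re-serialize the list as one array; a minimal sketch (e.g. to adapt into a scripted processor), with placeholder paths:

```python
# Minimal sketch converting newline-delimited JSON (one object per line, as
# BigQuery exports it) into a single JSON array. Paths are placeholders.
import json

with open("bq_export.json", encoding="utf-8") as f:
    records = [json.loads(line) for line in f if line.strip()]

with open("valid_array.json", "w", encoding="utf-8") as f:
    json.dump(records, f)
```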