Member since: 11-16-2015
Posts: 892
Kudos Received: 650
Solutions: 245
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 5667 | 02-22-2024 12:38 PM |
|  | 1389 | 02-02-2023 07:07 AM |
|  | 3085 | 12-07-2021 09:19 AM |
|  | 4205 | 03-20-2020 12:34 PM |
|  | 14160 | 01-27-2020 07:57 AM |
04-01-2018
04:18 PM
@Chen Yimu Did the answer help resolve your query? Please close the thread by marking the answer as Accepted!
03-27-2018
12:16 PM
The relevant part of the log is "Address already in use": it looks like you've configured ListenHTTP to listen on port 8081, but some other process is already using that port.
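A quick way to confirm the conflict (a sketch, assuming you can run a Groovy script on the NiFi host) is to try binding the port yourself; a java.net.BindException means another process already holds it:

// Sketch: attempt to bind port 8081; a java.net.BindException here
// means some other process is already listening on that port.
def socket = new java.net.ServerSocket(8081)
println 'Port 8081 is free'
socket.close()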
03-28-2019
02:01 PM
@Matt Burgess It works fine if there is just one object in the input tree; if there are more, it merges them into arrays rather than producing separate records, like:

{
  "agent_submit_time" : [ -1, -1 ],
  "agent_end_time" : [ 123445, 123445 ],
  "agent_name" : [ "Marie Bayer-Smith", "Marie Bayer-Smith" ]
}

I would like it to be something like:

[
  {
    "agent_submit_time" : -1,
    "agent_end_time" : 123445,
    "agent_name" : "Marie Bayer-Smith"
  },
  {
    "agent_submit_time" : -1,
    "agent_end_time" : 123445,
    "agent_name" : "Marie Bayer-Smith"
  }
]

How can I do that? I tried, but couldn't get it to work: replacing "*": "&" with "@": "[&]" makes them separate records, but then the transformation of - to _ no longer takes place.
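For illustration, a JOLT shift spec that yields that output, under the assumption that the input is a top-level array of flat objects with dashed keys (agent-submit-time, agent-end-time, agent-name); mapping each field explicitly keeps the dash-to-underscore rename while still emitting separate records:

[ {
  "operation" : "shift",
  "spec" : {
    "*" : {
      "agent-submit-time" : "[&1].agent_submit_time",
      "agent-end-time" : "[&1].agent_end_time",
      "agent-name" : "[&1].agent_name"
    }
  }
} ]

Here &1 resolves to the array index of the matched element, so each input object lands in its own slot of the output array.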
03-12-2018
01:04 PM
Native through the standard Hadoop Java client.
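For illustration, a minimal sketch of using the standard Hadoop Java client from Groovy (the core-site.xml location below is an assumption; point it at your own cluster config):

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.FileSystem
import org.apache.hadoop.fs.Path

// Sketch: build a client from the cluster config and list the HDFS root.
def conf = new Configuration()
conf.addResource(new Path('/etc/hadoop/conf/core-site.xml'))  // assumed location
def fs = FileSystem.get(conf)
fs.listStatus(new Path('/')).each { println it.path }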
03-05-2018
06:15 PM
My file extension is avro, so I can't use def schema = ff.read().withReader("UTF-8"){ new JsonSlurper().parse(it) }. I have to read each record and then parse the schema. Do you have an example, please, @Matt Burgess?
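A sketch of one way to do that in an ExecuteScript Groovy body, assuming the flow file content is an Avro data file with an embedded schema (the avro.schema attribute name is just for illustration):

import org.apache.avro.file.DataFileStream
import org.apache.avro.generic.GenericDatumReader
import org.apache.avro.generic.GenericRecord
import org.apache.nifi.processor.io.InputStreamCallback

def flowFile = session.get()
if (!flowFile) return
def schema = null
session.read(flowFile, { inputStream ->
    // DataFileStream reads the schema embedded in the Avro container file
    new DataFileStream<GenericRecord>(inputStream, new GenericDatumReader<GenericRecord>()).withCloseable { reader ->
        schema = reader.schema
    }
} as InputStreamCallback)
flowFile = session.putAttribute(flowFile, 'avro.schema', schema.toString())
session.transfer(flowFile, REL_SUCCESS)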
02-22-2018
06:04 PM
Thanks for replying @Abdelkrim Hadjidj @Matt Burgess. Now I have two different schemas for input and output CSV for LookupRecord. I modified Result RecordPath to /COMPANY. When I start LookupRecord it takes in 6 flow files but doesn't return anything.
04-10-2019
06:24 PM
Hi @dhieru singh Thank you for the post, but I am unable to get the values of $.component.backPressureObjectThreshold and $.status.aggregateSnapshot.flowFilesQueued after processing with EvaluateJsonPath.
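For reference, those two paths only resolve if the flow file content is shaped like the NiFi REST API connection entity, and if EvaluateJsonPath's Destination is set to flowfile-attribute with one property per path. A sketch of the expected shape (the values here are made up):

{
  "component" : {
    "backPressureObjectThreshold" : 10000
  },
  "status" : {
    "aggregateSnapshot" : {
      "flowFilesQueued" : 42
    }
  }
}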
06-22-2018
02:16 PM
1 Kudo
The process group name can actually be found with the attached Groovy code:

def flowFile = session.get()
if (!flowFile) return
// NOTE: procNode is NiFi's internal processor node object, not part of the
// public scripting API, so this may change between NiFi releases.
def processGroupName = context.procNode?.getProcessGroup()?.getName()
flowFile = session.putAttribute(flowFile, 'processGroupName', processGroupName)
session.transfer(flowFile, REL_SUCCESS)
01-30-2018
01:53 PM
1 Kudo
@Anil Reddy In addition to Matt's answer: if you want to see all the functions Expression Language supports in your version of NiFi, then:
1. Click the Global Menu button in the upper right corner.
2. Click Help.
3. Click Expression Language Guide; the right side will show all the functions supported in your version of NiFi.

If you want to implement the expression ${closed_epoch:format("yyyy", "GMT")} even though it is not supported in your version of NiFi, then as a workaround you can use the plus/minus functions. Assuming your closed_epoch attribute value is 1453843201123:

${closed_epoch:minus(86400000)}

This expression subtracts 86,400,000 milliseconds (i.e. 24 hrs) from the closed_epoch value, so the new value of closed_epoch will be 1453756801123.

*If you are in Daylight Saving Time, then you need to adjust the milliseconds value in the minus (or plus) function.
01-24-2018
03:44 AM
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.DecoderFactory;

Schema schema = ....  // schema construction elided

// Decode a single binary-encoded Avro record from the raw bytes
BinaryDecoder binaryDecoder = DecoderFactory.get().binaryDecoder(value.toString().getBytes(), null);
GenericDatumReader<GenericRecord> payloadReader = new GenericDatumReader<>(schema);
GenericRecord record = payloadReader.read(null, binaryDecoder);
System.out.println("decoded record: " + record);