Member since: 06-14-2022
Posts: 62
Kudos Received: 2
Solutions: 0
05-21-2025
10:06 PM
Hi, can you please help? @MattWho
05-21-2025
01:59 AM
Hi, what are the steps to create a Python custom processor in NiFi 2.3.0? I have created one, but it is not showing in the NiFi GUI. Can you please help me with the process/steps to create a Python custom processor in NiFi? Thank you.
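For reference, a minimal NiFi 2.x Python processor can be sketched as below. The `nifiapi` module is supplied by the NiFi runtime itself (it is not on PyPI), so outside NiFi this sketch stubs just enough of it to be exercised standalone; the processor name and behavior here are illustrative, not from the thread.

```python
try:
    # Inside NiFi 2.x, the runtime provides this module.
    from nifiapi.flowfiletransform import FlowFileTransform, FlowFileTransformResult
except ImportError:
    # Outside NiFi, stand-ins so the sketch can run on its own.
    class FlowFileTransform:
        def __init__(self, **kwargs):
            pass

    class FlowFileTransformResult:
        def __init__(self, relationship, contents=None, attributes=None):
            self.relationship = relationship
            self.contents = contents
            self.attributes = attributes


class UppercaseContent(FlowFileTransform):
    """Example processor: uppercases the FlowFile content."""

    class Java:
        implements = ['org.apache.nifi.python.processor.FlowFileTransform']

    class ProcessorDetails:
        version = '1.0.0'
        description = 'Uppercases the incoming FlowFile content.'

    def __init__(self, **kwargs):
        super().__init__()

    def transform(self, context, flowfile):
        # Read the FlowFile content, transform it, and route to success.
        text = flowfile.getContentsAsBytes().decode('utf-8')
        return FlowFileTransformResult(relationship='success',
                                       contents=text.upper())
```

As I understand it, for the processor to appear in the GUI the `.py` file must be placed under the directory configured by `nifi.python.extensions.source.directory.default` in `nifi.properties` (default `./python/extensions`), and `nifi.python.command` must point at a compatible Python 3 interpreter; `logs/nifi-python.log` usually shows why a processor failed to load.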
Labels:
- Apache NiFi
09-23-2024
03:50 AM
1 Kudo
I am running it once a day; I am using cron to schedule the job. @MattWho
09-11-2024
10:48 PM
1 Kudo
We are using the ConsumeIMAP processor in NiFi. I have scheduled the job to run once a day. When the job runs at the scheduled time, it reads all the mails but processes only one. But when I trigger it manually, it reads all the mails and processes all of them. What could be the issue? Below is the configuration for the same.
Labels:
- Apache NiFi
03-21-2023
06:16 AM
Yes, I am doing the same. And how do I maintain the order of the dynamic properties in the InvokeHTTP processor? The order is also important; otherwise it returns a bad request. Whenever I add them, they are automatically arranged in alphabetical order instead of the order in which I added them.
03-21-2023
05:57 AM
Hi @steven-matison, I am using the same. I need to upload the file using this method, and the order of the dynamic properties is also important. Do you know how I can achieve this?
03-21-2023
04:11 AM
Hi Team, how do I invoke an API with Content-Type multipart/form-data using the InvokeHTTP processor in NiFi? It's not working for me. If anyone has done the same, can you please share? Thank you.
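For context on what InvokeHTTP ultimately has to send, here is a standard-library Python sketch of a multipart/form-data body. The key points for this thread: each form field is a separate part delimited by a boundary, and the parts appear in exactly the order they are written, so using an ordered list of (name, value) pairs preserves the ordering the target API expects. Field names and filenames below are hypothetical.

```python
import io
import uuid


def build_multipart(fields, file_field, file_name, file_bytes):
    """Build an ordered multipart/form-data body plus its Content-Type value.

    fields is a list of (name, value) pairs; a list (not a dict) is used
    deliberately so part order is preserved.
    """
    boundary = uuid.uuid4().hex
    buf = io.BytesIO()
    # Plain text fields, emitted in the order given.
    for name, value in fields:
        buf.write((f'--{boundary}\r\n'
                   f'Content-Disposition: form-data; name="{name}"\r\n\r\n'
                   f'{value}\r\n').encode('utf-8'))
    # The file part comes last here, with its own Content-Type.
    buf.write((f'--{boundary}\r\n'
               f'Content-Disposition: form-data; name="{file_field}"; '
               f'filename="{file_name}"\r\n'
               f'Content-Type: application/octet-stream\r\n\r\n').encode('utf-8'))
    buf.write(file_bytes)
    # Closing boundary terminates the body.
    buf.write(f'\r\n--{boundary}--\r\n'.encode('utf-8'))
    return buf.getvalue(), f'multipart/form-data; boundary={boundary}'
```

Capturing such a request (for example with a test endpoint) and comparing it to what InvokeHTTP emits is one way to see whether the ordering or a missing boundary is what the server rejects as a bad request.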
Labels:
- Apache NiFi
02-09-2023
05:38 AM
I need to use the controller service with credentials, but I am still getting the error. Do you have an idea how I can do that? Any suggestions?
02-09-2023
04:06 AM
Hi Team, I have a requirement to generate temporary AWS credentials, fetch files from an S3 bucket, and use these credentials in FetchS3Object. But at the FetchS3Object processor I am getting the below error while fetching files from the bucket: "FetchS3Object[id=xxx] Failed to retrieve S3 Object for FlowFile[filename=xyz.json]; routing to failure: com.amazonaws.SdkClientException: Unable to load AWS credentials from any provider in the chain: [EnvironmentVariableCredentialsProvider: Unable to load AWS credentials from environment variables (AWS_ACCESS_KEY_ID (or AWS_ACCESS_KEY) and AWS_SECRET_KEY (or AWS_SECRET_ACCESS_KEY)), SystemPropertiesCredentialsProvider: Unable to load AWS credentials from Java system properties (aws.accessKeyId and aws.secretKey), WebIdentityTokenCredentialsProvider: To use assume role profiles the aws-java-sdk-sts module must be on the class path., com.amazonaws.auth.profile.ProfileCredentialsProvider@3075e891: profile file cannot be null, com.amazonaws.auth.EC2ContainerCredentialsProviderWrapper@2162bad1: Failed to connect to service endpoint: ]" Can you please help me with the issue?
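As I read it, the exception means FetchS3Object fell back to the AWS SDK's default provider chain because no credentials were supplied to the processor. One way to attach temporary (assumed-role) credentials is an AWSCredentialsProviderControllerService; a rough sketch of the configuration follows, with property names approximate to the NiFi version in use and all ARNs/names hypothetical:

```
AWSCredentialsProviderControllerService
  Access Key ID            : <long-lived key used to call STS>
  Secret Access Key        : <matching secret>
  Assume Role ARN          : arn:aws:iam::<account-id>:role/<role-name>
  Assume Role Session Name : nifi-fetch-s3        (any label)
  Session Time             : 3600                 (seconds)

FetchS3Object
  AWS Credentials Provider service : (the service above)
  Bucket / Object Key / Region     : as already configured
```

With the controller service referenced from FetchS3Object, the service generates and refreshes the temporary credentials itself, so they do not have to be injected per FlowFile.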
Labels:
- Apache NiFi
01-11-2023
01:21 AM
Hi all, I have a requirement where I need to parse the data into the required format.

Input:

{ "Message" : "\nRecord 1:\nRequired data is missing. \n\nRecord 2:\nprocessing failed\n" }

Here the content and delimiters are not fixed; the only fixed part is the "\nRecord" keyword, on which I am writing the script.

Desired output:

[{ "Record 1": "Required data is missing." }, { "Record 2": "processing failed" }]

I had written a Groovy script for the same but was getting an empty array: the raw FlowFile content is JSON, so each "\n" in Message is still a two-character escape sequence, and splitting the raw text on a real newline matched nothing. The corrected script parses the JSON first:

    import org.apache.commons.io.IOUtils
    import java.nio.charset.StandardCharsets
    import groovy.json.JsonSlurper
    import groovy.json.JsonOutput
    import org.apache.nifi.processor.io.StreamCallback

    def flowFile = session.get()
    if (!flowFile) return

    try {
        flowFile = session.write(flowFile, { inputStream, outputStream ->
            def text = IOUtils.toString(inputStream, StandardCharsets.UTF_8)
            // Parse the JSON wrapper first so the \n escapes in "Message"
            // become real newlines before splitting.
            def message = new JsonSlurper().parseText(text).Message
            def records = message.split('\nRecord ').findAll { it.trim() }
            def result = records.collect { part ->
                def (num, detail) = part.split(':', 2)
                [('Record ' + num.trim()): detail.replaceAll('\n', ' ').trim()]
            }
            outputStream.write(JsonOutput.prettyPrint(JsonOutput.toJson(result))
                    .getBytes(StandardCharsets.UTF_8))
        } as StreamCallback)
        session.transfer(flowFile, REL_SUCCESS)
    } catch (Exception e) {
        log.error('Error while parsing records', e)
        flowFile = session.putAttribute(flowFile, 'error', e.getMessage())
        session.transfer(flowFile, REL_FAILURE)
    }

Thank you.
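The same parsing logic can be checked outside NiFi with a short Python sketch; the essential point carries over: parse the JSON wrapper first (so the escaped \n sequences become real newlines), then split on the "Record N:" markers. The function name is illustrative.

```python
import json
import re


def split_records(raw):
    """Parse the JSON wrapper, then split Message on its 'Record N:' markers."""
    message = json.loads(raw)['Message']
    out = []
    # Each record runs from 'Record <n>:' up to the next marker or the end;
    # the surrounding content and delimiters are otherwise not fixed.
    for match in re.finditer(r'Record (\d+):\s*(.*?)(?=\nRecord \d+:|\Z)',
                             message, re.S):
        out.append({f'Record {match.group(1)}': match.group(2).strip()})
    return out
```

Splitting the raw JSON text on a literal newline instead (as the original script did) finds no match at all, which reproduces the empty-array symptom.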
Labels:
- Apache NiFi