Member since: 07-13-2020
Posts: 58
Kudos Received: 2
Solutions: 10
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1226 | 09-04-2020 12:33 AM
 | 7790 | 08-25-2020 12:39 AM
 | 2449 | 08-24-2020 02:40 AM
 | 2170 | 08-21-2020 01:06 AM
 | 1160 | 08-20-2020 02:46 AM
07-20-2021
01:18 AM
@Althotta, Have any of the replies in this post helped you resolve your issue? If so, please mark the appropriate reply as the solution; that will make it easier for others to find the answer in the future.
07-14-2021
12:36 AM
Is it possible to show exactly what the character is? With the logical type string, it should accept any character as long as it is not an invalid or garbage value.
07-13-2021
12:38 PM
@Ash1 It is not clear from your query how files are getting into or out of your NiFi. Assuming you have already received a set of FlowFiles for the day into your NiFi dataflow, the best approach may be to notify if any file fails to be written out/transferred at the end of your dataflow. In this manner you would not only know that not all files transferred, but also exactly which file failed to transfer.

There are numerous processors that handle writing out FlowFile content (transfer) to another source or the local file system. Those processing components typically have relationships for handling various types of failures. These relationships could be sent through a retry loop via the RetryFlowFile [1] processor back to the same transfer processor that failed. You define in the RetryFlowFile processor how many times you want a FlowFile to traverse this loop. After X loops it would get routed out of the loop to your PutEmail [2] processor, where you could dynamically set the email content to include attributes from that FlowFile such as the filename, the hostname of the NiFi node that failed to transfer it, etc. From the PutEmail processor you could send that FlowFile somewhere else for holding until manual intervention is taken in response to that email. A sketch of this loop follows below.

[1] https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.13.2/org.apache.nifi.processors.standard.RetryFlowFile/index.html
[2] https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.13.2/org.apache.nifi.processors.standard.PutEmail/index.html

If you found any of the responses given here helpful with your query, please take a moment to log in and click "Accept" on each of those solutions.

Thank you,

Matt
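As a rough sketch of that loop (PutSFTP stands in for whichever transfer processor you actually use; the Maximum Retries property and relationship names come from the RetryFlowFile docs linked above, and the email subject/body are illustrative):

```
PutSFTP  (example transfer processor)
 ├── success ───────────────► downstream flow
 └── failure ──► RetryFlowFile
                  Maximum Retries: 3
                  ├── retry ────────────► back to PutSFTP
                  └── retries_exceeded ─► PutEmail
                        Subject: Transfer failure on ${hostname()}
                        Message: Failed to transfer ${filename} after 3 attempts
                        └── success ──► PutFile (holding area for manual review)
```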
07-08-2021
06:59 AM
Thank you for your participation in the Cloudera Community. I'm happy to see you resolved your issue. Please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future.
07-05-2021
04:55 AM
You can tail nifi-app.log from within NiFi and use a multi-line regex to extract the response; see the sketch below. If you find this answer helpful, please accept it as a solution.
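The usual building blocks for this are the TailFile and ExtractText processors. A minimal sketch, assuming the response is logged between markers you can anchor a regex on (the log path, markers, and regex below are illustrative, not from your environment):

```
TailFile
  File(s) to Tail: /opt/nifi/logs/nifi-app.log
      │
      ▼
ExtractText
  Enable Multiline Mode: true
  Enable DOTALL Mode: true
  response (dynamic property): Response begin\s*(.*?)\s*Response end
```

The text captured by the regex is written to FlowFile attributes named after the dynamic property, which you can then route or report on downstream.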
07-05-2021
04:44 AM
This is a heap space problem. Check whether your NiFi cluster is sized for the amount of data you are consuming from Kafka, or try a smaller dataset first. If you find this answer helpful, please accept it as a solution.
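For reference, the NiFi JVM heap is set in conf/bootstrap.conf on each node. A minimal sketch (the 4g value is only an example; size it to your hardware and flow):

```
# conf/bootstrap.conf -- JVM memory settings (example sizes)
java.arg.2=-Xms4g
java.arg.3=-Xmx4g
```

NiFi must be restarted for the change to take effect.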
07-05-2021
02:20 AM
You need to provide more info here: What is the data type of each column? How are you adding the data? What is the data format of the Hive table? And when you do get the correct result, is it the same result each time or different rows?
07-05-2021
02:09 AM
Check your Kerberos credentials cache. Also note that the keyring cache type is not completely compatible; you may have to use a file credential cache (see the sketch below). If this doesn't work, please post a stack trace so we can understand the problem.
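If the keyring cache turns out to be the problem, switching to a file-based cache is a one-line change in /etc/krb5.conf (the path template shown is the common default; adjust it to your environment):

```
[libdefaults]
    # use a file credential cache instead of the kernel keyring
    default_ccache_name = FILE:/tmp/krb5cc_%{uid}
```

After changing this, run kinit again and confirm the new cache location with klist.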
03-01-2021
06:58 PM
Hi, are there any other solutions for the 403 error? In my case I only use a local hadoop.tar.gz to initialize my cluster; I do not use a repository. Could you please give me some suggestions? Thanks a lot.
09-24-2020
12:12 AM
Hi, you can use CompressContent to decompress. I am not 100% sure it decompresses LZO files. If not, you can use ExecuteStreamCommand to run a shell command to uncompress the files; see the sketch below. Hope this helps.
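A sketch of the ExecuteStreamCommand route, assuming the lzop binary is installed on every NiFi node (the property names are from the processor; the lzop arguments are my assumption for your files):

```
ExecuteStreamCommand
  Command Path: lzop
  Command Arguments: -d;-c    # arguments are ;-delimited by default
```

The FlowFile content is piped to the command's stdin; with -d -c, lzop writes the decompressed stream to stdout, which becomes the new FlowFile content on the "output stream" relationship.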