Member since: 02-01-2022
Posts: 274
Kudos Received: 97
Solutions: 60

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 402 | 05-15-2025 05:45 AM
 | 3396 | 06-12-2024 06:43 AM
 | 5926 | 04-12-2024 06:05 AM
 | 4065 | 12-07-2023 04:50 AM
 | 2184 | 12-05-2023 06:22 AM
08-25-2023
11:33 AM
Hello @steven-matison, thanks for your reply! I was able to install RDP on my EC2 instance and access the NiFi UI successfully. How can I make the UI accessible directly from my browser, without going through RDP on the EC2 instance?
08-24-2023
12:31 PM
@kothari It is not Ranger's job to inform the client applications using Ranger which users belong to which groups. Each client application is responsible for determining which groups the user authenticated into that service belongs to.

The policies generated by Ranger are downloaded by the client applications. Within that downloaded policy JSON will be one or more resource identifiers, a list of user identities authorized (read, write, and/or delete), and a list of group identities authorized (read, write, and/or delete) against each resource identifier. So when the client checks the policies downloaded from Ranger, it looks for the user identity being authorized, and if the client is aware of the group(s) that user belongs to, it will also check authorization for those group identities.

So in your case, it is most likely that your client service/application has not been configured with the same user and group associations set up in your Ranger service.

If you found that the provided solution(s) assisted you with your query, please take a moment to login and click Accept as Solution below each response that helped. Thank you, Matt
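To make the policy-download model above concrete, here is a minimal sketch in Python of how a client-side check works. The dictionary is an illustrative simplification, not the exact Ranger policy JSON schema, and the resource, user, and group names are made up:

```python
# Illustrative, simplified stand-in for one downloaded Ranger policy item.
# The real policy JSON has more fields; this only shows the idea that each
# resource carries its own lists of authorized users and groups.
policy = {
    "resource": "/data/sales",                   # hypothetical resource identifier
    "users":  {"alice": {"read", "write"}},      # user identities and their accesses
    "groups": {"analysts": {"read"}},            # group identities and their accesses
}

def is_authorized(user, user_groups, access, policy):
    """Check the downloaded policy the way a client application would:
    first by user identity, then by any group the CLIENT knows the user is in."""
    if access in policy["users"].get(user, set()):
        return True
    # Group membership comes from the client's own user/group configuration,
    # not from Ranger -- if the client doesn't know the group, this check fails.
    return any(access in policy["groups"].get(g, set()) for g in user_groups)

# "bob" is only authorized via the "analysts" group, so the client must
# already associate bob with that group for the check to succeed.
print(is_authorized("bob", ["analysts"], "read", policy))   # True
print(is_authorized("bob", [], "read", policy))             # False
```

The second call failing is exactly the symptom described: the policy grants access to the group, but the client does not know the user belongs to it.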
08-24-2023
06:17 AM
I'm facing the same issue with the ADF Hive connector. It would be great if you could provide your configuration details.
08-22-2023
05:24 AM
@sahil0915 What you are proposing would require you to ingest into NiFi all ~100 million records from DC2, hash each record, and write all ~100 million hashes to a map cache like Redis or HBase (which you would also need to install somewhere) using the DistributedMapCache processor. You would then ingest all ~100 million records from DC1, hash those records, and finally compare the hashes of those 100 million records with the hashes you added to the distributed map cache using DetectDuplicate. Any records routed to non-duplicate would represent what is not in DC2. Then you would have to flush your distributed map cache and repeat the process, except this time writing the hashes from DC3 to the distributed map cache.

I suspect this is going to perform poorly. You would have NiFi ingesting ~300 million records just to create hashes for a one-time comparison.

If you found that the provided solution(s) assisted you with your query, please take a moment to login and click Accept as Solution below each response that helped. Thank you, Matt
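To illustrate the mechanics being described (and why they are heavy for a one-time comparison), here is a minimal sketch in plain Python of the same hash-then-detect-duplicate idea. It uses an in-memory set where NiFi would use a distributed map cache, and the sample records are hypothetical:

```python
import hashlib

# Hypothetical sample data standing in for the ~100M records per data center.
dc1_records = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}, {"id": 3, "name": "c"}]
dc2_records = [{"id": 1, "name": "a"}, {"id": 3, "name": "c"}]

def record_hash(record: dict) -> str:
    # Hash a canonical string form of the record so field order never matters.
    canonical = "|".join(f"{k}={record[k]}" for k in sorted(record))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Stand-in for the distributed map cache: load every DC2 record hash first.
dc2_hashes = {record_hash(r) for r in dc2_records}

# DC1 records whose hash is not in the cache are the "non-duplicate" route,
# i.e. present in DC1 but missing from DC2.
missing_from_dc2 = [r for r in dc1_records if record_hash(r) not in dc2_hashes]
print(missing_from_dc2)   # [{'id': 2, 'name': 'b'}]
```

At the scale in question, every record from every data center has to be read and hashed just to build and probe the cache, which is where the cost comes from.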
08-21-2023
08:37 AM
Let's take this in a different direction... open up a code box in your reply (choose Preformatted) and insert lines 0 - 11 there. Remove anything sensitive, of course.
08-21-2023
06:29 AM
@learner-loading Were you able to resolve your issue? If any of the above posts were the solution, please mark the appropriate one as the solution, as it will make it easier for others to find the answer in the future.
08-18-2023
08:23 AM
2 Kudos
For information, a Jira ticket has been created: https://issues.apache.org/jira/browse/NIFI-11967
08-17-2023
06:07 AM
@sree21 You should be able to download the driver and use it anywhere; the license is just for the Hive endpoint itself. You can find the download page here: https://www.cloudera.com/downloads/connectors/hive/odbc/2-6-1.html
08-17-2023
04:49 AM
1 Kudo
Thanks. It was an issue with the connection string: it was pointing to one database while my processor was supposed to be inserting into another.
08-16-2023
06:20 AM
1 Kudo
@janvit04 The pattern you need is this: ${input_date:toDate("MM/dd/yyyy hh:mm:ss"):format("yyyy-MM-dd HH:mm:ss")} I did this in a test which you can find here. In this example I have an UpdateAttribute with an input attribute called input_date whose string value is "8/6/2023 12:46 am". In the next UpdateAttribute I do the toDate and format. With this setup you may need to adjust the format in the toDate function to match your input string until you get the right output. For example, I thought it should be m/d/yyyy, but I got the right output using MM/dd/yyyy.
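As a quick way to sanity-check which pattern letters your input string actually needs, here is a rough Python analogue of the same parse-then-reformat idea. This is not NiFi Expression Language (which uses Java SimpleDateFormat pattern letters), and the input value below is just the example from this thread:

```python
from datetime import datetime

# Hypothetical input value, mirroring the example attribute in this thread.
input_date = "8/6/2023 12:46 am"

# Parse with a pattern that matches the input, then re-format to the target
# layout -- the same two-step toDate()/format() idea, just in Python syntax.
parsed = datetime.strptime(input_date, "%m/%d/%Y %I:%M %p")
print(parsed.strftime("%Y-%m-%d %H:%M:%S"))   # 2023-08-06 00:46:00
```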