Member since: 11-17-2021
Posts: 1081
Kudos Received: 249
Solutions: 25

My Accepted Solutions
| Title | Views | Posted |
| --- | --- | --- |
|  | 155 | 08-04-2025 04:17 PM |
|  | 327 | 06-03-2025 11:02 AM |
|  | 1021 | 12-13-2024 07:54 AM |
|  | 677 | 11-15-2024 12:41 PM |
|  | 1488 | 10-14-2024 02:54 PM |
09-17-2025
05:15 AM
Hello @Malrashed
1. How much delay are you observing for queries to appear on the CM > Impala > Queries page when queries are executed from Hue?
2. Do you observe the same delay when running queries from impala-shell?
09-16-2025
01:22 AM
Just to add more input on this, I can see a side effect of it shown in the HBCK report under Inconsistent Regions. In the HBase master logs I can also see:

<2025-09-16T10:11:27.734+0200> <WARN> <janitor.ReportMakingVisitor>: <INCONSISTENCY: Row name is not equal to serialized info:regioninfo content; row=XXX:XXX,e14ef354e1644a6a2b229ffde05f38d3 XXX:XXX,test,12345.07a983faf71bc641bfbb4e8736d378ac.; See if RegionInfo is referenced in another hbase:meta row? Delete?>

In meta:

XXX:XXX,e14ef354e1644a6a2b229ffde05f38d3 column=info:regioninfo, timestamp=2025-08-26T12:12:21.389, value={ENCODED => 07a983faf71bc641bfbb4e8736d378ac, NAME => 'XXX:XXX,test,12345.07a983faf71bc641bfbb4e8736d378ac.', STARTKEY => 'test', ENDKEY => ''}
09-12-2025
11:41 AM
@Alexm__ While I have never done anything myself with Azure DevOps pipelines, I don't see why this would not be possible.

Dev, test, and prod environments will likely have slight variations in NiFi configuration (source and target service URLs, usernames/passwords, etc.), so when designing your Process Group dataflows you'll want to take that into account and use NiFi's Parameter Contexts to define such variable configuration properties.

Sensitive properties (passwords) are never passed to NiFi-Registry, so any version-controlled PG imported into another NiFi will not have the passwords set. Once you version control that PG, you can deploy it through REST API calls to other NiFi deployments (a rough sketch of such a call is included at the end of this post). The first time it is deployed, it will simply import the Parameter Context used in the source (dev) environment. You would then need to modify that Parameter Context in the test and prod environments to set passwords and alter any other parameters as needed by each unique environment.

Once the modified Parameter Context of the same name exists in the other environments, promoting new versions of dataflows that use that Parameter Context becomes very easy. The updated dataflows will continue to use the local environment's Parameter Context values rather than those used in dev. If a new parameter is introduced to the Parameter Context, it simply gets added to the existing Parameter Context of the same name in the test and prod environments. These are the considerations to account for in your automated promotion of version-controlled dataflows between environments.

Versioning a DataFlow
Parameters in Versioned Flows

Please help our community grow. If you found that any of the suggestions/solutions provided helped you with solving your issue or answering your question, please take a moment to log in and click "Accept as Solution" on one or more of them that helped.

Thank you,
Matt
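For anyone wiring this into a pipeline, here is a minimal sketch of deploying a version-controlled PG via the NiFi REST API, written in Python with the requests library. The base URL, token handling, certificate path, and all UUIDs below are placeholder assumptions for your environment, and entity fields can vary between NiFi releases, so treat this as a starting point rather than a drop-in script:

```python
import requests

# Placeholders: base URL, token, CA path, and all UUIDs are assumptions.
NIFI = "https://nifi-test.example.com:8443/nifi-api"
HEADERS = {"Authorization": "Bearer <token>"}  # e.g. from POST /access/token
VERIFY = "/path/to/ca.pem"                     # CA bundle for TLS verification

parent_pg_id = "root"  # UUID of the PG to deploy into ("root" for the canvas root)

payload = {
    "revision": {"version": 0},
    "component": {
        "position": {"x": 0.0, "y": 0.0},
        # Points at the flow stored in NiFi-Registry; sensitive values
        # (passwords) do NOT come along and must be set per environment.
        "versionControlInformation": {
            "registryId": "<registry-client-uuid>",
            "bucketId": "<bucket-uuid>",
            "flowId": "<flow-uuid>",
            "version": 3,  # the flow version to deploy
        },
    },
}

resp = requests.post(f"{NIFI}/process-groups/{parent_pg_id}/process-groups",
                     json=payload, headers=HEADERS, verify=VERIFY)
resp.raise_for_status()
print("Deployed versioned PG with id:", resp.json()["id"])
```

Changing an already-deployed PG to a newer flow version works in a similar fashion against the version-control endpoints; check the REST API docs for your NiFi release for the exact entity shapes.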
09-09-2025
03:31 PM
@venkatsambath @abdulpasithali @upadhyayk04 Hi! Do you have some insights here? Thanks!
09-09-2025
11:36 AM
Hi @DianaTorres, Thanks for your reply. I resolved this issue by modifying the networking. Best regards, Shubham Rai.
09-04-2025
04:54 PM
Here are some highlights from the month of August
WEBINAR
Introducing AI in a Box for Key Industries
Register Now
VIRTUAL EVENT
The latest innovations in data, analytics & AI
October 15, 2025
8:00 AM PT | 11:00 AM ET | 4:00 PM GMT | 5:00 PM CEST
Register Now
Check out the FY25 Cloudera Meetup Events Calendar for upcoming & past event details!
1360 members
5 new articles
70 questions
We would like to recognize the community members and employees below for their efforts over the last month in providing community solutions.
See all our top participants on the Top Solution Authors leaderboard, and find all the other leaderboards on our Leaderboards and Badges page.
@upadhyayk04 @MattWho @Gopinath @jagadeesan @rsanchez @RamaClouder @mslnrd @Alf015 @SAMSAL @Rah59
Share your expertise and answer some of the open questions below. Also, be sure to bookmark the unanswered questions page to find additional open questions.
| Unanswered Community Post | Components/Labels |
| --- | --- |
| I'm using Apache NiFi 2.x with Python-based custom processors. I have two different PythonProcessor scripts (in /python/extensions) with different logic. However, NiFi always runs only the first script's logic, even when I configure the second script in a different processor. | Apache NiFi |
| nifi-env.sh file is empty in 2.4.0. Upgrade issue in EKS | Apache NiFi |
| Error generating aggregated logs for Spark Applications on Cloudera CDP 7.2.18 | Apache Spark, Cloudera Data Platform (CDP) |
| Issue with JoinEnrichment Processor | Apache NiFi |
| Issue in upgrading nifi from 2.0.0 M4 to 2.4.0 | Apache NiFi |
The following employees published new public-facing community articles during this month.
| Community Article | Author | Components/Labels |
| --- | --- | --- |
| A Practical Guide to Fine-Tuning Language Models with GRPO | @K_Pamulaparthy | Cloudera Data Science and Engineering, Cloudera Data Science Workbench (CDSW), Cloudera Machine Learning (CML) |
| Understanding Reasoning Models with GRPO: A Conceptual Introduction for Building your own Medical Reasoning Model |  |  |
| CDP for AWS DNS Configuration | @Dongkai-Yu | Cloudera Data Engineering (CDE), Cloudera Data Platform (CDP), Cloudera Data Warehouse (CDW), Cloudera DataFlow (CDF) |
| Using HiveWareHouse Connector (HWC) with Cloudera DataEngineering | @abjain | Cloudera Data Engineering (CDE), Cloudera Data Platform (CDP) |
09-04-2025
05:27 AM
Hi @huimin, perfect! If you could describe the solution you implemented, it will help others who encounter the same issue. Hugs.
09-03-2025
10:38 AM
@Virt_Apatt I don't know enough about your use case to make any other suggestions. All I know is that your user(s) supply some custom date that you have NiFi add 10 days to before running an Oracle query to get some result set returned to NiFi.

NiFi is typically used to build dataflows that are always in the running state, so users do not need to continuously stop, modify component(s), and start a dataflow/component. What is the significance of this "custom date" that starts your dataflow? Is there any pattern to these custom dates? Can the next custom date be derived from the response to the previous Oracle query? How often does this dataflow get executed?

Just some examples (there are many NiFi processor components that can fetch content from external sources):
- You could start your dataflow with a GetSFTP or GetFile processor that checks a specific source SFTP server or local directory for a specific filename. In that file is your custom date. You then build your dataflow to extract that custom date from the consumed file and then execute your Oracle query. This way your NiFi is always running and just waiting for the next file to show up on the SFTP server or in that local directory it keeps checking.
- Or maybe set up an HTTP listener (ListenHTTP or HandleHTTPRequest) that listens for an HTTP POST containing the custom date needed by your running dataflow (a rough sketch of such a POST is included at the end of this post).

Please help our community grow. If you found that any of the suggestions/solutions provided helped you with solving your issue or answering your question, please take a moment to log in and click "Accept as Solution" on one or more of them that helped.

Thank you,
Matt
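To illustrate the ListenHTTP option, here is a minimal sketch of how an external system or user script could hand the custom date to an always-running dataflow. The host, port, and date value are made-up examples; "contentListener" is ListenHTTP's default Base Path, so adjust it to whatever you configure on the processor:

```python
import requests

# Hypothetical endpoint: host and port are whatever you set on ListenHTTP
# ("Listening Port" property); "contentListener" is the default "Base Path".
NIFI_LISTENER = "http://nifi.example.com:9999/contentListener"

# The POST body becomes the FlowFile content, so the dataflow can pull the
# date out (e.g. with ExtractText) and feed it to the downstream Oracle query.
resp = requests.post(NIFI_LISTENER, data="2025-09-03")
resp.raise_for_status()
print("NiFi accepted the date:", resp.status_code)
```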
09-01-2025
05:59 AM
Hello dear support team, I'm experiencing the same issue as the original poster and others in the thread. Could you please assist me in updating the email address associated with my account? Thank you very much!
08-25-2025
12:57 PM
@GKHN_ As I described in my first response, authentication and authorization are two different processes. From your comment it sounds like authentication is working fine for both of your users while authorization is failing for your non-admin user, so the issue is within the authorization phase.

I assume both of your users are authenticating via LDAP? In your ldap-provider in login-identity-providers.xml you have the "Identity Strategy" set to "USE_DN". With this setting, the user's full LDAP DN will be used as the user identity string after successful authentication. This means that the entire DN is being passed to the authorizer to look up whether that full DN has been authorized for the requested end-point NiFi policy.

I see you have your initial admin identity manually defined in the file-user-group-provider and the file-access-policy-provider:

CN=NIFIUSER,OU=Userpro,OU=CUsers,OU=Company,DC=company,DC=entp

So when you log in via LDAP with this user's LDAP username and password, the user's entire DN is passed to the authorizer, and the file-access-policy-provider has set up all admin-related NiFi policies for this initial admin user identity.

I also see from the shared authorizers.xml that the only user-group-provider the file-access-policy-provider is configured to use is the file-user-group-provider. The file-user-group-provider requires the admin user to add additional user identities manually via the NiFi UI (remember that with your current ldap-provider login provider, all your LDAP user identities are going to be full DNs):

1. As the admin user, go to the NiFi global menu and select "Users".
2. From the NiFi Users UI, select the "+" to add a new user.
3. Enter the full DN for your second user (case sensitive). Unless you have added any groups, your list of groups will be blank.

Now that you have added this second user identity, you'll need to start authorizing it for the various policies the user needs. In order to access the NiFi UI, all users must be authorized to "view the user interface":

1. From the same NiFi global menu mentioned above, select "Policies" this time.
2. From the "Access Policies" UI that appears, select "view the user interface" from the policy list pull-down.
3. Click on the icon to the right that looks like a person with a "+".
4. Find the user identity you just added, check the box, and click the "Add" button.

Now this user can access the NiFi UI. There are other policies this user will need before they can start building dataflows in the UI. NiFi allows for very granular authorizations, but at a minimum the user will need to be authorized on the process group in which they will build their dataflows.

Not all policies are defined from the "Access Policies" UI in the global menu. The component-level policies are defined directly via the individual component (keep an eye out for the "key" icon). From the "Operation" panel directly on the NiFi canvas you can set policies on the currently selected component. Above I have selected my root Process Group (PG). If you click the key icon, you will see all the access policies users can be authorized for. You'll need to select, one by one, each policy your user needs and add the user to it.

The above will allow you to set up access for your additional users using the file-user-group-provider you have configured in your authorizers.xml; a scripted sketch of the same steps via the REST API follows at the end of this post.

Please help our community grow. If you found that any of the suggestions/solutions provided helped you with solving your issue or answering your question, please take a moment to log in and click "Accept as Solution" on one or more of them that helped.

Thank you,
Matt
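For reference, the same add-user and authorize steps can be scripted against the NiFi REST API once your initial admin can authenticate. This is a minimal sketch in Python with requests; the base URL, token, example DN, and certificate path are placeholder assumptions, and the exact entity payloads should be checked against the REST API docs for your NiFi version:

```python
import requests

# Placeholder assumptions: base URL, token, DN, and CA path are examples only.
NIFI = "https://nifi.example.com:8443/nifi-api"
HEADERS = {"Authorization": "Bearer <token>"}
VERIFY = "/path/to/ca.pem"
SECOND_USER_DN = "CN=SECONDUSER,OU=Userpro,OU=CUsers,OU=Company,DC=company,DC=entp"

# 1) Create the user identity (same effect as Users -> "+" in the UI).
user = requests.post(
    f"{NIFI}/tenants/users",
    json={"revision": {"version": 0}, "component": {"identity": SECOND_USER_DN}},
    headers=HEADERS, verify=VERIFY,
).json()

# 2) Fetch the "view the user interface" policy (action "read" on resource /flow).
policy = requests.get(f"{NIFI}/policies/read/flow",
                      headers=HEADERS, verify=VERIFY).json()

# 3) Append the new user to the policy's user list and write it back.
policy["component"]["users"].append({"id": user["id"]})
requests.put(
    f"{NIFI}/policies/{policy['id']}",
    json={"revision": policy["revision"], "component": policy["component"]},
    headers=HEADERS, verify=VERIFY,
).raise_for_status()
```

Component-level policies (the "key" icon ones) follow the same read-modify-write pattern against /policies, just with the component's resource path instead of /flow.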