Member since
01-27-2023
220
Posts
61
Kudos Received
42
Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 76 | 11-22-2023 12:10 AM |
| | 178 | 11-06-2023 12:44 AM |
| | 280 | 11-02-2023 02:01 AM |
| | 374 | 10-18-2023 11:37 PM |
| | 329 | 10-09-2023 12:36 AM |
03-13-2023
02:49 AM
Without knowing what you have configured, I will try to assume some of your configuration:
- Based on your installation, I assume that you are using HTTP and not HTTPS, right? Looking at your screenshot, I can see that you are running on port 8443, which is the default port configured in nifi.web.https.port. I assume that you wrote the host and the port in the nifi.web.http.host and nifi.web.http.port properties and not in the HTTPS properties, right? If not, this is what might be causing your issue.
- Besides that, have you checked the logs for any errors? I am talking here about the bootstrap log (nifi-bootstrap.log) and the application log (nifi-app.log). If everything starts normally, the application log should show the link where NiFi is available. Otherwise, there is an error and you have to take a look.

PS: it would help if you could attach your nifi.properties here, with all the confidential information replaced by standard placeholders (like <hostname>, <password>, etc.)
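For reference, a plain-HTTP setup typically looks like the fragment below in nifi.properties. The hostname is a placeholder, and the key point is that the HTTPS properties stay empty when you run plain HTTP:

```properties
# nifi.properties -- assumed plain-HTTP setup (adjust host/port to your environment)
nifi.web.http.host=<hostname>
nifi.web.http.port=8080

# When running plain HTTP, leave the HTTPS properties empty:
nifi.web.https.host=
nifi.web.https.port=
```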
03-13-2023
02:38 AM
Can you maybe post the statement you are running in Mongo plus the result, and afterwards the query you execute in NiFi and the entire error message you receive? Right now it seems to be an error regarding the format of the date/query. I am not that familiar with MongoDB, but maybe in the meantime somebody with more experience will intervene.
Re: On a gaming laptop, Apache NiFi can be launche...
03-10-2023
05:38 AM
Hi @judywatson , the answer pretty much depends on the actions you are going to perform in NiFi. The short answer is yes, you can run NiFi on that gaming laptop, but don't expect to be able to run complex end-to-end flows :). For example, I have used a small Linux server with 8 GB of RAM and a 4-core CPU to handle a couple of my flows (extracting data out of a couple of DBs, transferring files from an SFTP server into AWS, converting some data CSV-AVRO, PARQUET-AVRO). I assigned 4 GB to the heap, left it running, and had no issues with it. But again, it mostly depends on what you are trying to achieve and the size of the data you are trying to ExtractTransformLoad 🙂
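For context, the heap assignment mentioned above is done in conf/bootstrap.conf via the java.arg lines (the defaults ship at 512 MB); a 4 GB assignment would look roughly like this:

```properties
# conf/bootstrap.conf -- JVM heap settings (4 GB, matching the sizing above)
java.arg.2=-Xms4g
java.arg.3=-Xmx4g
```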
03-10-2023
05:08 AM
Why does it not work in NiFi? What sort of error message do you receive? NiFi basically executes what you set in the query property: if you can execute it in Mongo, it should work from NiFi as well.
03-10-2023
03:04 AM
Maybe you can try with something like the following? I have no access to any MongoDB instance right now to test it myself 😞 {"TransactionHistory": {"$gt": "<your_value_here>"}}
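One thing to watch: if TransactionHistory holds an actual date, comparing it against a plain string will not match. In MongoDB extended JSON the comparison value gets wrapped in $date; the timestamp below is only an example value:

```json
{ "TransactionHistory": { "$gt": { "$date": "2023-03-01T00:00:00Z" } } }
```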
03-10-2023
02:50 AM
I see that you are using something which is neither a default component nor one belonging to NiFi. I would suggest you have a look at the JAR files in your PROD environment and see if you can find something that points to batchiq. Most likely that JAR file is missing from your DEV environment.
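A quick way to compare the two environments is to list the bundles in each install and flag what is missing. The directories below are made-up samples so the sketch is self-contained; point the loop at your real PROD and DEV lib/ folders instead:

```shell
# Sample directories standing in for the real PROD/DEV NiFi lib folders.
mkdir -p /tmp/nifi-prod/lib /tmp/nifi-dev/lib
touch /tmp/nifi-prod/lib/nifi-standard-nar.nar
touch /tmp/nifi-prod/lib/nifi-batchiq-nar-1.0.nar   # present only in "PROD"
touch /tmp/nifi-dev/lib/nifi-standard-nar.nar

# Flag every bundle that exists in PROD but not in DEV:
for f in /tmp/nifi-prod/lib/*; do
  name=$(basename "$f")
  [ -e "/tmp/nifi-dev/lib/$name" ] || echo "missing in dev: $name"
done
```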
03-09-2023
11:48 PM
1 Kudo
@anoop89 I never used the file-identity-provider so I am not really experienced with it 😞 Would it be possible to provide a short snippet from conf/login-credentials.xml? You can remove all the PII data and replace it with dummy values, but I would really like to see how the file is structured and try to reproduce the behavior on my local machine. Was this file generated automatically, or did you create it manually and keep using it on your prod/dev instances? PS: are you using the Cloudera version of NiFi?
03-09-2023
11:21 PM
Hi @deepak123 , what do you mean by NiFi is performing slowly? Based on your question, it is not very clear at which point you are encountering the performance issue: when waiting for an answer from your InvokeHTTP endpoint, or when doing some actions on the result of the API call?
03-09-2023
11:15 PM
1 Kudo
Hi @anoop89 , I can confirm that versions 1.19.1 and 1.20.1 work very well without LDAP or Kerberos. I have installed two clusters: one with no security active (an unsecured cluster) and one with only single user/password login activated. But here I think it mostly depends on the version (open-source, the Cloudera version, etc.) you are using. What I can tell from your logs is that you might have defined a wrong class for your login identity provider. By default, when I unzipped the NiFi ZIP file, nifi.properties contained the following lines:

nifi.login.identity.provider.configuration.file=./conf/login-identity-providers.xml
nifi.security.user.login.identity.provider=single-user-provider

The login-identity-providers.xml is defined as seen below, but you have two other options which are commented out: LDAP (<identifier>ldap-provider</identifier>) and Kerberos (<identifier>kerberos-provider</identifier>):

<provider>
    <identifier>single-user-provider</identifier>
    <class>org.apache.nifi.authentication.single.user.SingleUserLoginIdentityProvider</class>
    <property name="Username"/>
    <property name="Password"/>
</provider>

Maybe you are trying to use the file-provider option from within the authorizers.xml file, which comes commented out by default and is not recognized when starting NiFi? I think your best solution here would be to compare the configuration files from your DEV environment with those from your PROD environment. By doing that, you will identify where you defined the wrong property and can correct it straight away.
03-09-2023
10:45 PM
Hi @moahmedhassaan, it would really help if you would provide more details about your flow, even how you query your data. Having a MongoDB available for testing is not easy for everybody. Nevertheless, there might be a solution; it is not very efficient, but it might do the trick:
- Add a GenerateFlowFile processor, where you configure a property like: ${now():format("yyyy-MM-dd HH:mm")}:${now():format("ss"):minus(1)}. Set this processor to run only on the primary node so you won't have too many generated files.
- Send the success relationship to your GetMongo processor.
- Within the GetMongo processor, in the query property, write your query with the condition transactiondate > the_property_defined_in_GenerateFlowFile.
Again, this is not a very bright solution, but it could suit your needs until somebody with more experience can provide you with a better one 😊
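As a side note on that expression: it builds "now minus one second" by doing arithmetic on the formatted seconds string. A rough Python equivalent (the helper name is made up for illustration) shows what value gets produced, including the quirk that minus(1) at second 00 yields -1 rather than rolling the minute back:

```python
from datetime import datetime

def generate_marker(now=None):
    """Rough equivalent of the NiFi Expression Language property
    ${now():format("yyyy-MM-dd HH:mm")}:${now():format("ss"):minus(1)}
    (helper name is hypothetical, for illustration only)."""
    now = now or datetime.now()
    # minus(1) operates on the formatted seconds value, so at ss=00 this
    # produces -1 instead of rolling back to the previous minute.
    return f"{now.strftime('%Y-%m-%d %H:%M')}:{int(now.strftime('%S')) - 1}"

print(generate_marker(datetime(2023, 3, 9, 23, 15, 30)))  # 2023-03-09 23:15:29
```

Because of that edge case, a single "seconds minus one" marker is only a rough watermark, which is another reason this workaround is not a long-term solution.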