Member since: 01-27-2023
Posts: 229
Kudos Received: 73
Solutions: 45

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1287 | 02-23-2024 01:14 AM
 | 1589 | 01-26-2024 01:31 AM
 | 1098 | 11-22-2023 12:28 AM
 | 2675 | 11-22-2023 12:10 AM
 | 2705 | 11-06-2023 12:44 AM
05-09-2023
07:00 AM
@nuxeo-nifi, the processors you are referring to do not belong to any NiFi version (Cloudera or open source), meaning they were built in house, specifically for you and your project. In this case, you need to speak to whoever developed those processors and identify the application logic. Once you have that, you can use PutEmail to send email notifications and InvokeHTTP to perform the other actions. I assume your processors have a failure connection, which you could link to a PutEmail processor, in which you define whatever you want to be sent as a notification. In case of no failures, you can link the success queue out of your Nuxeo processor into InvokeHTTP and perform the call you require. For that, make sure all your certificates are in place and allow the connection between the systems. Otherwise, you won't be able to use InvokeHTTP and you would have to find another solution, like a script.
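If you do end up scripting it instead, a rough Python equivalent of that wiring (failure queue to an email notification, success queue to an HTTP call) could look like the sketch below. All hosts, addresses, the URL and the payload are placeholders, not values from your flow:

```python
# Rough Python equivalent of the wiring described above (failure ->
# email notification, success -> HTTP call), in case you script it
# instead. Hosts, addresses, URL and payload are all placeholders.
import smtplib
from email.message import EmailMessage

import requests

def notify_failure(reason: str) -> None:
    # What PutEmail would do on the failure relationship.
    msg = EmailMessage()
    msg["Subject"] = "Nuxeo processor failure"
    msg["From"] = "nifi@example.com"
    msg["To"] = "ops@example.com"
    msg.set_content(reason)
    with smtplib.SMTP("mail.example.com") as smtp:
        smtp.send_message(msg)

def on_success(payload: dict) -> None:
    # What InvokeHTTP would do on the success relationship.
    requests.post("https://target.example.com/api", json=payload, timeout=30)
```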
05-09-2023
12:19 AM
1 Kudo
@nuxeo-nifi, I am not that familiar with Nuxeo, but as far as I know you could use its REST API to batch upload documents into the Nuxeo system. To achieve this, you could use InvokeHTTP to perform the REST API calls against your Nuxeo endpoint, or you can develop a custom script and execute it within an ExecuteStreamCommand processor. As for the Nuxeo database (I do not know whether this is necessary or what sort of DB you have configured), you can use PutSQL (or any DB-related processor) to save your data in the DB, assuming you have the JDBC connection details configured in your DBCPConnectionPool.
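For illustration, here is a minimal sketch of the batch upload calls that InvokeHTTP (or a custom script in ExecuteStreamCommand) would perform. The endpoint paths follow Nuxeo's documented batch upload API, but verify them against your Nuxeo version; the host, credentials and file name are placeholders:

```python
# Minimal sketch of Nuxeo's batch upload REST flow. Verify the endpoint
# paths against your Nuxeo version; host, credentials and the file name
# below are placeholders.
import requests

NUXEO = "https://nuxeo.example.com/nuxeo"   # placeholder
AUTH = ("Administrator", "Administrator")   # placeholder credentials

# 1. Create an upload batch and remember its id.
batch = requests.post(f"{NUXEO}/api/v1/upload/", auth=AUTH).json()
batch_id = batch["batchId"]

# 2. Upload one file into slot 0 of the batch.
with open("document.pdf", "rb") as fh:
    requests.post(
        f"{NUXEO}/api/v1/upload/{batch_id}/0",
        auth=AUTH,
        headers={"X-File-Name": "document.pdf",
                 "X-File-Type": "application/pdf"},
        data=fh,
    )
```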
05-08-2023
03:01 AM
@Manimaran, for the future, it would really help if you could mention your NiFi version and the database you are using, because each version (of NiFi and of the DB) works differently. It would also help to know which processors you are using, so that we could understand your flow and provide a personalized answer. As for your problem, without any other information: you could use an ExecuteStreamCommand in which you define a Python/Bash/Groovy script (anything you want, basically) that calls your stored procedure. Link the processor that saves the data into your database to ExecuteStreamCommand using the success queue; once the data is inserted into your database, the FlowFile will go into ExecuteStreamCommand and call your script, which executes the stored procedure. In newer versions of NiFi you could also try calling the procedure using PutSQL, or you could try ExecuteScript (have a look here: http://funnifi.blogspot.com/2016/04/sql-in-nifi-with-executescript.html). As for the Wait/Notify processors, as far as I know there is nothing that can be used directly (out of the box) for this, and you will have to combine multiple processors to achieve it. A detailed answer is described here: https://pierrevillard.com/2018/06/27/nifi-workflow-monitoring-wait-notify-pattern-with-split-and-merge/
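As a starting point, a minimal sketch of a script that ExecuteStreamCommand could invoke might look like this. PostgreSQL with psycopg2 is an assumption here; swap in the driver matching your database, and note that the connection details and procedure name are placeholders:

```python
# Minimal sketch of a script ExecuteStreamCommand could invoke after the
# insert succeeds. PostgreSQL + psycopg2 is an assumption -- use the
# driver for your database. Connection details and the procedure name
# are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="db.example.com",   # placeholder
    dbname="mydb",
    user="nifi",
    password="secret",
)
with conn, conn.cursor() as cur:
    # CALL works for procedures on PostgreSQL 11+; other databases may
    # need SELECT my_proc() or a JDBC-style {call my_proc()} instead.
    cur.execute("CALL my_stored_procedure();")
conn.close()
```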
05-04-2023
06:30 AM
You do not install the Cloudera version on your laptop 🙂 You need Cloudera DataFlow for Public Cloud (CDF-PC), meaning we are talking about a license and some services. As @steven-matison already provided the perfect answer to your question, he might also be in a position to further assist you with everything you need to know about Cloudera DataFlow and the Public Cloud. Unfortunately, I am still learning about what Cloudera offers and how, so I am not the best one to answer your question. If you are going to use NiFi for some real data processing, I strongly recommend you have a look at Cloudera DataFlow, as it will save you many issues and headaches 🙂
05-04-2023
12:57 AM
@ushasri, what do you mean by a licensed version of NiFi? In all my experience with NiFi I have never heard of a licensed version, as it is an open-source tool. Are you perhaps talking about the Cloudera version, which is somewhat different as it is part of the Cloudera ecosystem? When it comes to the export, you select your entire canvas (or a process group, or a group of multiple processors), create a template out of it, go to NiFi's menu > Templates and download the newly created template. Afterwards, you can import that template into your new NiFi instance and start playing with it.
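If you prefer automation over the UI, the same export can be scripted. The sketch below uses endpoint paths from the NiFi 1.x REST API, but treat it as a sketch and verify against your version; the base URL and template name are placeholders, and authentication is omitted:

```python
# Sketch of exporting a template via the NiFi 1.x REST API instead of
# the UI. Verify the endpoints against your NiFi version; the base URL
# and template name are placeholders, and auth handling is omitted.
import requests

BASE = "https://nifi.example.com:8443/nifi-api"   # placeholder

# List the available templates and find ours by name.
resp = requests.get(f"{BASE}/flow/templates", verify=False).json()
template_id = next(
    t["id"] for t in resp["templates"]
    if t["template"]["name"] == "my_flow_template"  # placeholder name
)

# Download the template XML so it can be imported on another instance.
xml = requests.get(f"{BASE}/templates/{template_id}/download", verify=False)
with open("my_flow_template.xml", "wb") as fh:
    fh.write(xml.content)
```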
05-04-2023
12:44 AM
Assuming you are running on Linux, you need to find your operating system logs. In most Linux distributions, those logs are found in /var/log. Every sysadmin configures each server according to the company's rules and requirements, so I suggest you speak with the team responsible for the Linux server and ask them to provide you with the logs. In those logs, you might find out why you are receiving that error in the first place. Unfortunately, this problem is not really related to NiFi, but to your infrastructure or to how somebody uses your NiFi instance. Somebody is doing something and you need to find out who and what 😞
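As a very rough first pass before involving the sysadmin team, a small script like the one below can surface recent error lines. The file paths and keywords are assumptions; your distribution and syslog configuration may use different log names, and reading them may require root:

```python
# Quick triage sketch: scan common Linux log files for suspicious lines.
# Paths and keywords are assumptions; your distribution may use other
# names (e.g. /var/log/syslog vs. /var/log/messages).
from pathlib import Path

KEYWORDS = ("error", "denied", "segfault", "out of memory")

for name in ("/var/log/messages", "/var/log/syslog", "/var/log/auth.log"):
    path = Path(name)
    if not path.exists():
        continue
    try:
        with path.open(errors="replace") as fh:
            for line in fh:
                if any(k in line.lower() for k in KEYWORDS):
                    print(f"{name}: {line.rstrip()}")
    except PermissionError:
        print(f"{name}: permission denied (run as root?)")
```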
05-04-2023
12:33 AM
1 Kudo
@danielhg1285, while the solution provided by @SAMSAL seems better for you and more production-ready, you could also try the steps below. This might work if you are using a stable statement all the time and if you do not need to see the exact INSERT statement, but rather the values it tried to insert.
- Shortly after RetryFlowFile, add an AttributesToJSON processor and manually define all the columns you want to capture in the Attributes List property. Make sure you use the attribute names from your FlowFile (sql.args.N.value) in the correct order and set Destination = flowfile-content. In this way, you generate a JSON file with all the columns and values you tried to insert but which failed (see the sample output after this list).
- After AttributesToJSON, keep your PutFile to save the file locally on your machine, so you can open it whenever and wherever you want 🙂
PS: This is maybe not the best solution, for the following reasons, but it will get you started:
- You need to know how many columns you have to insert, and each time a new column is added you will have to modify your AttributesToJSON processor.
- You will not get the exact SQL INSERT/UPDATE statement, but a JSON file containing the column-value pairs, which can easily be analyzed by anybody.
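For illustration, the FlowFile content generated by AttributesToJSON might look like the following. The attribute names follow NiFi's sql.args.N.value convention; the values are invented:

```json
{
  "sql.args.1.value": "12345",
  "sql.args.2.value": "John Doe",
  "sql.args.3.value": "2023-05-04"
}
```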
05-03-2023
10:08 AM
@srv1009, in my overall experience with NiFi, the problem you reported happens for one of the following two reasons:
- as you already mentioned, an abrupt NiFi shutdown, which of course can corrupt your file. This is easily solved: you stop doing such shutdowns 🙂
- the physical disk on which NiFi is located gets corrupted and implicitly starts corrupting files as well. This has an easy solution too: you test your hardware's health and, if necessary, replace it.
Either way, the reason why this happens should be present either in the NiFi logs or in the server logs 🙂
04-28-2023
06:56 AM
Hi @Amit_barnwal, first of all, why are you using java.arg.2 and java.arg.3 with such a big difference between them? Remember that Xms = initial memory allocation and Xmx = maximum memory allocation, and both refer to the heap. In addition, the minimum recommended size for your heap is 4 GB. Have a look at the following article, which provides everything you need for configuring a NiFi cluster: https://community.cloudera.com/t5/Community-Articles/HDF-CFM-NIFI-Best-practices-for-setting-up-a-high/ta-p/244999 (see also the bootstrap.conf sketch below for where these flags live). Now, regarding the fact that your NiFi instance is eating a lot of RAM: you need to know that most processors fall into two categories, RAM-eating processors and CPU-eating processors. If your workflow contains many RAM-eating processors, it is normal that a lot of the available RAM gets used. PS: it is not really recommended to assign that much memory to your heap 🙂
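For reference, these flags live in NiFi's conf/bootstrap.conf. A hedged example with the two values kept close together (the sizes are illustrative, not a recommendation for your hardware):

```properties
# nifi/conf/bootstrap.conf -- example heap settings. Values are
# illustrative; size them to your hardware and keep Xms close to Xmx.
java.arg.2=-Xms8g
java.arg.3=-Xmx8g
```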
04-28-2023
02:21 AM
@Vas_R, how would you like to encrypt that data, given that you want to do this on each column and each row? Do you have something in mind, or are you looking for something built into NiFi directly? If you want something directly in NiFi, I do not think you will find anything that specific. You could try extracting your relevant data into attributes and use something like the CryptographicHashAttribute processor to apply a hash algorithm to those attributes. Next, you can use an AttributesToCSV/AttributesToJSON processor and generate a new FlowFile with the hashed data. If CSV or JSON is not the best format for you, you can add an extra ConvertRecord and transform your data into whatever format you want. But be careful, as this solution will require many resources if you are playing with large amounts of data. Another solution would be to find an encryption algorithm and implement it within a script: add an ExecuteStreamCommand processor, which reads the Avro file, performs the encryption and writes out the newly generated Avro file.
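To make the script idea concrete, here is a rough sketch of what ExecuteStreamCommand could run: read Avro from stdin, hash selected fields, write Avro back to stdout. fastavro and the column names are assumptions, and note that hashing (as with CryptographicHashAttribute) is not reversible encryption; a real encryption scheme would need key management on top:

```python
# Rough sketch for ExecuteStreamCommand: read Avro records from stdin,
# hash selected columns, write Avro back to stdout. fastavro and the
# column names are assumptions; this also assumes the hashed columns
# are string-typed in the schema. SHA-256 mirrors what
# CryptographicHashAttribute does and is NOT reversible encryption.
import hashlib
import sys

import fastavro

SENSITIVE_FIELDS = ("name", "email")   # placeholder column names

reader = fastavro.reader(sys.stdin.buffer)
schema = reader.writer_schema

records = []
for record in reader:
    for field in SENSITIVE_FIELDS:
        if record.get(field) is not None:
            record[field] = hashlib.sha256(
                str(record[field]).encode("utf-8")).hexdigest()
    records.append(record)

fastavro.writer(sys.stdout.buffer, schema, records)
```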