- Member since: 03-10-2017
- Posts: 171
- Kudos Received: 80
- Solutions: 32
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1838 | 08-12-2024 08:42 AM |
| | 3018 | 05-30-2024 04:11 AM |
| | 3811 | 05-29-2024 06:58 AM |
| | 2597 | 05-16-2024 05:05 AM |
| | 1950 | 04-23-2024 01:46 AM |
03-31-2022
10:10 PM
Hi @Tamiri, Has the reply helped resolve your issue? If so, please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future.
03-24-2022
03:28 AM
Hi @krishna123, I have the same problem. Thank you for helping me
03-14-2022
10:08 PM
Hi @gaofzhan, For what reason are you copying the JDBC driver to NiFi's lib folder? When you need a JDBC driver to connect to remote sources, you typically only need to copy the ojdbc8.jar file. Copying all the files that come in the tarball downloaded from Oracle can cause errors like the one you describe. Please remove the copied files from the lib folder, copy only ojdbc8.jar, and try again. Cheers, André -- Was your question answered? Please take some time to click on "Accept as Solution" below this post. If you find a reply useful, say thanks by clicking on the thumbs up button.
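The cleanup above can be sanity-checked with a small script. This is a hypothetical helper (the function name, the prefix list, and the idea of scanning the lib folder are my assumptions, not a NiFi tool): it lists Oracle-related jars in a directory other than the single driver jar you actually need.

```python
import os

def extra_oracle_jars(lib_dir, keep="ojdbc8.jar"):
    """Return Oracle-related jars in lib_dir other than the one driver
    jar to keep; everything else from the Oracle tarball is a candidate
    for removal, since extra jars can conflict with NiFi's own libraries."""
    # Common file-name prefixes of jars shipped in the Oracle client tarball
    # (an assumption for illustration; adjust to what you actually see).
    oracle_prefixes = ("ojdbc", "ucp", "orai18n", "xdb", "simplefan", "ons")
    extras = []
    for name in sorted(os.listdir(lib_dir)):
        if name == keep or not name.endswith(".jar"):
            continue
        if name.startswith(oracle_prefixes):
            extras.append(name)
    return extras
```

Anything the function returns is a jar you likely copied by mistake and should remove from the lib folder.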
02-28-2022
05:08 AM
1 Kudo
A few points about connecting to any database from NiFi. NiFi has a generic controller service named DBCPConnectionPool, which creates a connection pool to a given database; processors such as ExecuteSQL or PutSQL use this controller service to obtain connections and run the given queries.

Minimum prerequisites to start with DBCPConnectionPool:
1. Determine the type of database.
2. Determine the JDBC URL syntax supported by that database.
3. Place the database client driver (one that works outside of NiFi to connect to the same database) on each NiFi node, in a location readable by the user running the NiFi service.

All of the above must be provided by the user. If the UCanAccess 5.0.1 client driver works outside of NiFi, then it should work with DBCPConnectionPool as well. Where are you getting the message "Given file does not exist"? Is the client driver present on the NiFi hosts, and is the configured Database Driver Location correct? What about permissions? You mentioned that a CData driver works but is a licensed product: it works from where? From NiFi's DBCPConnectionPool? Thank you.
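The "Given file does not exist" question above boils down to two checks on each NiFi node. A minimal sketch (the function name and messages are mine, not part of NiFi) of the pre-flight check that mirrors what the Database Driver Location property needs:

```python
import os

def check_driver_location(path):
    """Pre-flight check for a JDBC driver jar: the file must exist and be
    readable by the user running this process (i.e. the NiFi service user)."""
    if not os.path.exists(path):
        return "Given file does not exist"
    if not os.access(path, os.R_OK):
        return "File exists but is not readable by this user"
    return "OK"
```

Run it as the same OS user that runs the NiFi service, against the exact path configured in the controller service, on every node in the cluster.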
01-24-2022
04:47 AM
1 Kudo
Are you able to run the same command arguments as the user nifi (or whichever user is responsible for running the NiFi service) at a command prompt, against the path determined by fileslocation, outside of NiFi?
01-21-2022
03:15 AM
Not supported. ListFTP and ListSFTP work on the last-modified timestamp to pick up files newly modified since the previous run. So if a file is added with an older last-modified timestamp than one ListFTP has already picked up, that file won't be picked up by the ListFTP logic; the min/max file age properties do not align with the current listing strategy.
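The listing behavior described above can be modeled in a few lines. This is a simplified sketch of the timestamp-based state logic (not NiFi's actual implementation): only files newer than the stored state are emitted, and the state only moves forward, so a file that arrives later with an older timestamp is never listed.

```python
def list_new_files(files, last_seen_ts):
    """Minimal model of timestamp-based listing state.
    files: list of (name, last_modified_ts) pairs currently on the server.
    last_seen_ts: the newest timestamp picked up on a previous run.
    Returns (files to emit, updated state)."""
    picked = [name for name, ts in files if ts > last_seen_ts]
    # State advances to the newest timestamp seen and never goes back.
    new_state = max([last_seen_ts] + [ts for _, ts in files])
    return picked, new_state
```

After the first run picks up a file with timestamp 200, a file later dropped on the server with timestamp 150 is silently skipped, which is exactly the limitation the question runs into.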
01-21-2022
03:08 AM
In response to your queries:

1. Do you know how to add the "Minimum File Age" property in the ListFTP processor?
Ans: Not supported. ListFTP and ListSFTP work on the last-modified timestamp to pick up files newly modified since the previous run. So if a file is added with an older last-modified timestamp than one ListFTP has already picked up, that file won't be picked up by the ListFTP logic.

2. Can ListSFTP connect to an FTP server? I tried, but failed.
Ans: Use ListFTP if the FTP server is not secure.

3. What is the difference between ListFTP and ListSFTP?
Ans: There are two types of FTP servers, FTP and SFTP. SFTP uses a secure channel to transfer files while FTP doesn't; thus NiFi has both FTP and SFTP processors.

To address your use case ("the file may be updated at any time, so I need the 'Minimum File Age' property like the ListSFTP processor has"): you need to change the logic of how files are written to the FTP server. If the same files are updated/appended multiple times while the write is not yet complete, rename the file to a specific name pattern after the append is completed, and only list/fetch the files that match the rename pattern using the File Filter Regex setting.
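The rename-then-filter workaround above can be sketched like this. The `.tmp`/`.done` naming convention and the helper name are my own illustrative assumptions; the only real NiFi concept here is that ListFTP's File Filter Regex only lists names matching the pattern.

```python
import re

# Assumed convention: writers append to "name.csv.tmp" while the upload is
# in progress and rename to "name.csv.done" once the write is complete.
# ListFTP's File Filter Regex would be set to the same pattern, so only
# finished files are ever listed.
FILE_FILTER_REGEX = re.compile(r".*\.done$")

def completed_files(listing):
    """Return only the files whose name marks them as fully written."""
    return [name for name in listing if FILE_FILTER_REGEX.match(name)]
```

Files still being appended (`*.tmp`) never match the filter, so the in-progress-write problem disappears without needing a Minimum File Age property.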
01-19-2022
08:35 AM
Hi @ckumar, thanks for the reply. I think that screenshot lacked info, possibly due to a permission issue on my side. The same issue has been posted by a teammate here: https://community.cloudera.com/t5/Support-Questions/NiFi-Node-showing-2-nodes-and-not-respecting-node-down-fault/td-p/334221 Please take a look at the screenshot attached there, which has more info. I can log in to the UI from all 3 nodes, and on all nodes only nodes 1 and 2 show up.
01-17-2022
09:15 AM
Can you also tell me how to request data from a fixed date (period), rather than everything that is in the bucket? Thanks!
01-14-2022
07:24 AM
@LejlaKM Sharing your dataflow design and processor component configurations may help you get more and better responses to your query. Things you will want to look at before and while you run this dataflow:

1. NiFi heap usage and general memory usage on the host.
2. Disk I/O and network I/O.
3. NiFi host CPU utilization. (If your flow consumes 100% of the CPU(s) during execution, this can lead to what you are observing. Does UI functionality return once the copy is complete?)
4. Your dataflow design implementation, including the components used, configurations, concurrent tasks, etc.

While most use cases can be accomplished through dataflow implementations within NiFi, not all use cases are a good fit for NiFi. In this case your description points at copying a large table from one Oracle DB to another. You made no mention of any filtering, modifying, enhancing, etc. being done to the table data during this move, which is where NiFi would fit in. If your use case is a straightforward copy from A to B, then NiFi may not be the best fit for this specific use case, as it will introduce unnecessary overhead to the process. NiFi ingests content, writes it to a content_repository, and creates FlowFiles with attributes/metadata about the ingested data stored in a flowfile_repository. It then has to read that content back as it writes it out to a destination. For simple copy operations where no intermediate manipulation or routing of the DB contents is needed, a tool that streams directly from DB A to DB B would likely be much faster. If you found this response assisted with your query, please take a moment to log in and click on "Accept as Solution" below this post. Thank you, Matt
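The "stream directly from DB A to DB B" alternative can be sketched in a few lines. This is a minimal illustration, not a production tool; it uses sqlite3 as a stand-in for the Oracle connections (for Oracle you would use a client library such as oracledb with the same fetch-in-batches pattern), and the function name and batch size are my assumptions.

```python
def copy_table(src, dst, table, batch_size=1000):
    """Stream rows from one DB connection to another in fixed-size batches,
    never materializing the whole table in memory and, unlike a NiFi
    content/FlowFile pipeline, never writing intermediate repositories."""
    cur = src.execute(f"SELECT * FROM {table}")
    placeholders = ",".join("?" * len(cur.description))
    while True:
        rows = cur.fetchmany(batch_size)
        if not rows:
            break
        dst.executemany(f"INSERT INTO {table} VALUES ({placeholders})", rows)
    dst.commit()
```

Because each batch goes straight from the source cursor to the destination insert, there is no on-disk staging step in between, which is the overhead the post describes NiFi adding for a plain A-to-B copy.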