Member since: 08-11-2017 · Posts: 51 · Kudos Received: 2 · Solutions: 0
07-16-2018
02:48 PM
Try forcing TLS v1.2 on all agents and see if that helps resolve the issue. Add the following line under the [security] section of the agent configuration file (ambari-agent.ini) on every agent host:

[security]
force_https_protocol=PROTOCOL_TLSv1_2
06-04-2018
09:03 PM
1 Kudo
Thanks, Shu, for your timely reply. I made it work! I set the CSVRecordSetWriter property "Record Separator" to ",". With this change, the cached data looked like "7000,7001,7003,". After fetching the data from the cache, I had to use an expression to remove the trailing "," for the filter query, since it was causing an error. The final working filter query is:

SELECT * FROM FLOWFILE where VBEOID in (${KBA_OID_cache:substring(0,${KBA_OID_cache:lastIndexOf(',')})})

Is there a size limit on how much data I can keep in the cache this way?
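The trailing-separator trim above can be illustrated outside NiFi; here is a minimal Python sketch of the same substring/lastIndexOf logic (the cache value "7000,7001,7003," is from the post; the function name is mine):

```python
def trim_trailing_separator(cached: str, sep: str = ",") -> str:
    """Mimic the NiFi EL expression substring(0, lastIndexOf(',')):
    keep everything before the last separator."""
    return cached[:cached.rfind(sep)]

# Value as written by CSVRecordSetWriter with "," as the record separator.
cached = "7000,7001,7003,"
in_clause = trim_trailing_separator(cached)
query = f"SELECT * FROM FLOWFILE where VBEOID in ({in_clause})"
```

This is only a model of the expression's behavior, not NiFi code; in the flow itself the trim happens inside the Expression Language.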
06-02-2018
03:20 AM
Also, the flow works with "SELECT * FROM FLOWFILE where VBEOID in (7000,7001,7003)", but it shows an error with "SELECT * FROM FLOWFILE where VBEOID in (${KBA_OID_cache})". I think I have some incorrect configuration in PutDistributedMapCache and/or FetchDistributedMapCache.
06-02-2018
03:14 AM
The input CSV file used for loading data into PutDatabaseRecord is attached (sample.txt). I executed a QueryDatabaseTable to get the data for loading into PutDistributedMapCache.
06-02-2018
01:58 AM
Thank you, Geoffrey. This solution worked.
06-02-2018
01:54 AM
Thank you, Shu, for your reply. I tried the method, but I'm getting an error and I'm not sure how to solve it. I used QueryDatabaseTable to query a database table column and loaded the data into PutDistributedMapCache. I then fetched the cached data using FetchDistributedMapCache and used the resulting attribute in QueryRecord. QueryRecord is showing an error; the error is attached.
05-25-2018
04:21 PM
Hi Geoffrey, thanks for the reply. The server's current limit is 3262161 (i.e., over 3 million). I think an incorrect flow file in NiFi caused this large number of open files. Can I create a new directory, content_repositoryX, change the NiFi property below to point at the new directory, and start NiFi? Will that help NiFi start?

nifi.content.repository.directory.default=./content_repositoryX
05-25-2018
03:53 PM
I'm running into a situation where I'm unable to restart my NiFi instance; the nifi-app.log error is provided below. I noticed a great many folders under content_repository/. The latest folder is named "1", and I noticed 5 larger files in it. Is there anything I can change in the NiFi properties file to expire the contents of content_repository so that NiFi will start?

2018-05-25 06:57:12,805 WARN [Cleanup Archive for default] o.a.n.c.repository.FileSystemRepository Failed to cleanup archived files in /appl/nifi/nifi-1.5.0/content_repository/190/archive due to java.nio.file.FileSystemException: /appl/nifi/nifi-1.5.0/content_repository/190/archive: Too many open files
2018-05-25 06:57:12,805 WARN [Cleanup Archive for default] o.a.n.c.repository.FileSystemRepository Failed to cleanup archived files in /appl/nifi/nifi-1.5.0/content_repository/191/archive due to java.nio.file.FileSystemException: /appl/nifi/nifi-1.5.0/content_repository/191/archive: Too many open files
2018-05-25 06:57:12,806 WARN [Cleanup Archive for default] o.a.n.c.repository.FileSystemRepository Failed to cleanup archived files in /appl/nifi/nifi-1.5.0/content_repository/192/archive due to java.nio.file.FileSystemException: /appl/nifi/nifi-1.5.0/content_repository/192/archive: Too many open files
content_repository/ has many sub-folders, numbered 1-197; the latest few are listed below:
-------------------------------------------------------------------------------
drwxr-xr-x 3 user tp 21 May 11 12:37 8
drwxr-xr-x 3 user tp 21 May 11 12:37 6
drwxr-xr-x 3 user tp 21 May 11 12:37 5
drwxr-xr-x 3 user tp 21 May 11 12:37 4
drwxr-xr-x 3 user tp 21 May 11 12:37 3
drwxr-xr-x 3 user tp 21 May 11 12:37 7
drwxr-xr-x 3 user tp 21 May 14 06:55 9
drwxr-xr-x 3 user tp 21 May 14 06:55 10
drwxr-xr-x 3 user tp 136 May 24 06:55 1
drwxr-xr-x 3 user tp 44 May 24 08:39 2
Directory 1 - content:
----------------------
total 476
drwxr-xr-x 3 user tp 136 May 24 06:55 .
drwxr-xr-x 1026 user tp 20480 Jan 22 19:47 ..
-rw-r--r-- 1 root root 1637 May 15 15:52 1526411432690-1
-rw-r--r-- 1 root root 38369 May 17 14:32 1526558069151-1
-rw-r--r-- 1 root root 48687 May 22 11:45 1526990105222-1
-rw-r--r-- 1 root root 589 May 23 14:33 1527076500470-1
-rw-r--r-- 1 root root 341020 May 24 09:03 1527162955688-1
drwxr-xr-x 2 user tpxes 6 May 22 06:55 archive
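For reference, content-archive retention can be tightened in nifi.properties so that old content claims are purged sooner. A sketch, using standard NiFi 1.x property names with illustrative values (tune them to your disk capacity):

```
# nifi.properties (illustrative values, not a recommendation)
nifi.content.repository.archive.enabled=true
nifi.content.repository.archive.max.retention.period=12 hours
nifi.content.repository.archive.max.usage.percentage=50%
```

Note that retention settings only take effect once NiFi is running; a "Too many open files" failure at startup usually also calls for raising the OS open-file limit (ulimit) for the NiFi user.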
Labels: Apache NiFi
05-23-2018
07:52 PM
1 Kudo
I'm creating a NiFi flow to read CSV file data and load it into a relational database. I used the QueryRecord processor to read the CSV, convert it to JSON, and filter flow file data using some parameters. Everything works perfectly: data is loaded to the database based on the filter criteria I added to the flow query. The query looks like "SELECT * FROM FLOWFILE where VBEOID in (7000,7001,7003)". In this query I have added only 3 filter values in the "IN" clause, but in a real-life situation there will be thousands of entries in this filter criteria. I would like to read these values from a database table, or call a REST service to get the data, and substitute the values in place of the hard-coded ones. Is there a way I can do that?
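One way to avoid hard-coding the IN list is to fetch the ids first and assemble the clause dynamically. A minimal Python sketch of the idea (the id values, table, and column names are from the post; the fetch step is stubbed and would be a database query or REST call in practice):

```python
def build_in_clause(ids) -> str:
    """Join ids into the comma-separated list expected inside IN (...)."""
    return ",".join(str(i) for i in ids)

# Stubbed: in the real flow these would come from a database table
# (e.g. QueryDatabaseTable) or a REST service.
ids = [7000, 7001, 7003]
query = f"SELECT * FROM FLOWFILE where VBEOID in ({build_in_clause(ids)})"
```

This mirrors what the distributed-map-cache approach discussed in the replies does inside NiFi: the id list is stored as a single comma-separated string and substituted into the query text.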
Labels: Apache NiFi
05-11-2018
05:03 PM
Thank you, Shu. This worked perfectly, thanks for the additional info as well.