Support Questions
Find answers, ask questions, and share your expertise

nifi too many open files error

New Contributor

I got an error while using a NiFi processor (NiFi version 1.4).

I set '/etc/security/limits.conf' as below:

* hard nofile 50000 
* soft nofile 50000 
* hard nproc 10000 
* soft nproc 10000
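One thing worth verifying (my note, not from the original post): limits.conf is applied by PAM at login time, so a NiFi process that was started before the edit, or launched by an init script that bypasses PAM, can still be running with the old limit. The limit actually in force for the process is readable from /proc:

```shell
# Read the open-files limit actually applied to a running process.
# $$ (this shell) is a placeholder; substitute the real NiFi PID,
# e.g. PID=$(pgrep -f 'org.apache.nifi' | head -n 1)
PID=$$
grep 'Max open files' /proc/"$PID"/limits
```

If this still shows 1024 (a common default) rather than 50000, the new limits never reached the process and a restart from a fresh login session is needed.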

The NiFi process gets stuck 3-4 days after starting.

nifi-app.log shows the following:

2018-03-26 01:57:51,690 INFO [Provenance Maintenance Thread-3] o.a.n.p.PersistentProvenanceRepository Created new Provenance Event Writers for events starting with ID 22310326
2018-03-26 01:57:51,823 INFO [Provenance Repository Rollover Thread-1] o.a.n.p.lucene.SimpleIndexManager Index Writer for /data001/nifi/provenance_repository/index-1521686514000 has been returned to Index Manager and is no longer in use. Closing Index Writer
2018-03-26 01:57:51,826 INFO [Provenance Repository Rollover Thread-1] o.a.n.p.PersistentProvenanceRepository Successfully merged 16 journal files (399 records) into single Provenance Log File /data001/nifi/provenance_repository/22309927.prov in 144 milliseconds
2018-03-26 01:57:51,827 INFO [Provenance Repository Rollover Thread-1] o.a.n.p.PersistentProvenanceRepository Successfully Rolled over Provenance Event file containing 331 records. In the past 5 minutes, 1526 events have been written to the Provenance Repository, totaling 1.84 MB
2018-03-26 01:58:09,477 INFO [pool-10-thread-1] o.a.n.c.r.WriteAheadFlowFileRepository Initiating checkpoint of FlowFile Repository
2018-03-26 01:58:09,527 ERROR [pool-10-thread-1] o.a.n.c.r.WriteAheadFlowFileRepository Unable to checkpoint FlowFile Repository due to java.io.FileNotFoundException: /data001/nifi/flowfile_repository/partition-227/1.journal (Too many open files)
java.io.FileNotFoundException: /data001/nifi/flowfile_repository/partition-227/1.journal (Too many open files)
        at java.io.FileOutputStream.open0(Native Method)
        at java.io.FileOutputStream.open(FileOutputStream.java:270)
        at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
        at java.io.FileOutputStream.<init>(FileOutputStream.java:162)
        at org.wali.MinimalLockingWriteAheadLog$Partition.rollover(MinimalLockingWriteAheadLog.java:779)
        at org.wali.MinimalLockingWriteAheadLog.checkpoint(MinimalLockingWriteAheadLog.java:528)
        at org.apache.nifi.controller.repository.WriteAheadFlowFileRepository.checkpoint(WriteAheadFlowFileRepository.java:451)
        at org.apache.nifi.controller.repository.WriteAheadFlowFileRepository$1.run(WriteAheadFlowFileRepository.java:423)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
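Before blaming a particular processor, it can help to see which paths actually hold the descriptors. A rough sketch (my addition, assuming the NiFi PID is known):

```shell
# Group the process's open file descriptors by target path; thousands of
# entries pointing at tailed log files would implicate TailFile, while
# repository files would point elsewhere.
PID=$$   # placeholder: substitute the real NiFi PID
ls -l /proc/"$PID"/fd 2>/dev/null \
  | awk 'NR > 1 {print $NF}' \
  | sort | uniq -c | sort -rn | head
```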

I suspect the 'TailFile' processor, but I am not sure and am struggling to figure out how to solve it.

My 'TailFile' processor configuration is shown in the attached image.

Every 5 minutes, log files are written to the base directory, and I want to pick up only the changed log contents.

The log file names look like these (yyyyMMdd changes every day):

'GSFS_APIC-yyyyMMdd-AAAAAAA.log'

'GSFS_APIC-yyyyMMdd-BBBBBBB.log'

...

'GSFS_APIC-yyyyMMdd-ZZZZZZZZ.log'

So I set the TailFile property like this:

File(s) to Tail : GSFS_APIC-${now():format('yyyyMMdd')}.*\.log
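A quick sanity check of that pattern (my sketch, using grep -E as a stand-in for the Java regex that TailFile applies, with the ${now():format('yyyyMMdd')} expression expanded by hand for 2018-03-26):

```shell
# The tail pattern with the date expression expanded for 2018-03-26.
pattern='GSFS_APIC-20180326.*\.log'
printf '%s\n' \
  'GSFS_APIC-20180326-AAAAAAA.log' \
  'GSFS_APIC-20180325-BBBBBBB.log' \
  'GSFS_APIC-20180326.log.gz' \
| grep -E "^${pattern}$"
# -> prints GSFS_APIC-20180326-AAAAAAA.log (the other names do not match)
```

Note that TailFile in Multiple-files mode keeps per-file state, so a pattern matching many files multiplies the number of handles in play as the matched set changes each day.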

Is there something wrong?

How can I solve this 'Too many open files' error?

Please help me...

In the meantime, the open file count (lsof | wc -l) has reached 38628.
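A note on the measurement (my addition): lsof with no arguments is system-wide and also lists mem/cwd/txt entries that do not consume file descriptors, so `lsof | wc -l` overstates NiFi's real usage. The entries under /proc/&lt;pid&gt;/fd are the descriptors the process actually holds:

```shell
# Count the file descriptors actually held by a single process.
PID=$$   # placeholder: substitute the real NiFi PID
ls /proc/"$PID"/fd | wc -l
```

Comparing this number against the 'Max open files' value in /proc/&lt;pid&gt;/limits shows how close the process really is to the ceiling.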

I should restart nifi soon...

Thank you in advance.


tailfile.png
3 REPLIES

Re: nifi too many open files error

Mentor

@yc choi

To see the limits associated with your login, run ulimit -a as the account under which NiFi is running (the nifi user, in your case).

The max-user-processes limit is checked and changed with the -u option:

$ ulimit -u

The limit that matters for this error is open files, which uses the -n option. For example, to allow 50000 open files in the current shell:

$ ulimit -n 50000

If you see an error like "-bash: ulimit: open files: cannot modify limit: Operation not permitted", the requested value exceeds your hard limit; check the current limits:

$ ulimit -a

In that case, switch to root and raise the limits in /etc/security/limits.conf, then log in again (the file is applied at login time).

Validate the change; the output should show the new value:

$ ulimit -n
50000

Hope that helps

Re: nifi too many open files error

New Contributor

Thank you for the reply.

I checked 'ulimit -a' as the 'nifi' user, which is running the NiFi process.

After a lot of googling, I found this issue (https://issues.apache.org/jira/browse/NIFI-5043).

It may be related to that.

Re: nifi too many open files error

@yc choi

Depending on how many flow files are in your flow at any one time, even 50,000 file handles may not be enough.

Change the nofile values to 999999 and that should take care of your issue.
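For reference, a sketch of the matching /etc/security/limits.conf entries, scoped here to a hypothetical nifi service account rather than the * wildcard (* does not cover root, and user-specific entries make the intent explicit):

```
nifi  soft  nofile  999999
nifi  hard  nofile  999999
```

The new limit applies only to sessions started after the change, so NiFi has to be restarted from a fresh login.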