
'Too many open files' error: /etc/security/limits.d/hive.conf is not applied to process



We set the hive_user_nofile_limit param to 32000 in the Ambari UI.

Indeed, we see this value in /etc/security/limits.d/hive.conf.

However, in /proc/<hive-pid>/limits I don't see this value. Instead, I see the original 4096 for both the soft and hard "Max open files" limits (even after a restart).

How do I apply this change to the Hive process?

Also, I can see that all the open files are in /tmp/hive/operation_logs. How necessary are these logs? Should I just bypass the above error by disabling them? (A quick check for both points is sketched below the post.)

Thanks,

Michael
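
A minimal sketch of both checks mentioned above, assuming HiveServer2 is the process hitting the limit and that the pgrep pattern and default property names match your install:

# Check which limits the running HiveServer2 process actually inherited
# (the 'hiveserver2' pattern is an assumption; adjust it to your install)
HS2_PID=$(pgrep -f hiveserver2 | head -n 1)
grep -E 'Max open files|Max processes' /proc/${HS2_PID}/limits

# The files under /tmp/hive/operation_logs come from HiveServer2 operation logging,
# controlled by these hive-site.xml properties:
#   hive.server2.logging.operation.enabled       - true by default; setting it to false
#                                                  stops the per-query log files, but tools
#                                                  such as Beeline lose their query progress logs
#   hive.server2.logging.operation.log.location  - where the per-session log directories are written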

2 REPLIES

Re: 'Too many open files' error: /etc/security/limits.d/hive.conf is not applied to process

@Michael Belostoky: I have faced the same issue, and this is the suggestion I received from an SME. The issue can be caused by the Ambari agent, so he advised me to make sure ambari-agent has been restarted.

If you haven't done so already, would you mind restarting the agent, then restarting the DataNodes from Ambari again, and then checking the limit with the following command:

cat /proc/`lsof -ti:8010`/limits | grep 'open files\|processes'
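
Since this thread is about the Hive process rather than a DataNode, the same check can be pointed at the HiveServer2 port instead; the port below is the common Thrift default (10000) and is an assumption to adjust to your configuration:

# Same check against the HiveServer2 port (assumed default 10000)
cat /proc/`lsof -ti:10000`/limits | grep 'open files\|processes'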

As a workaround, I hope this article will help you:

https://community.hortonworks.com/articles/3332/datanode-does-not-pick-up-correct-ulimit-on-secure.h...

Re: 'Too many open files' error: /etc/security/limits.d/hive.conf is not applied to process


I have a workaround for this issue.

Add the following line to hadoop-env.sh:

ulimit -n 32000
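
One caveat worth hedging: on an Ambari-managed cluster, hadoop-env.sh is regenerated from the env template, so a sketch of where the line would need to go (the menu path is an assumption based on a typical Ambari layout):

# Ambari UI (typical path, adjust to your version):
#   Services > HDFS > Configs > Advanced hadoop-env > hadoop-env template
# For the Hive process specifically, the hive-env template is the analogous place.
# Append the line, save, and restart the affected components so they inherit it:
ulimit -n 32000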
