HiveServer2 Hive user's nofile ulimit above 64000?

Rising Star

By default, the hive user's nofile ulimit (managed through Ambari) is 32000. We reached that limit last week on our HiveServer2 server and increased the value in Ambari to 64000. We have now hit the 64k nofile ulimit as well, which leads me to believe that HiveServer2 is not cleaning up connections the way it should and files are not being released.

Has anyone else experienced this issue? Any suggestions on what to check?

What do you have your hive user's nofile limit set to?

# lsof -u hive | wc -l

64450

We are on HDP 2.4.2, Ambari 2.2.2

Should hive really have that many files open?
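
To figure out where the descriptors are going, one option is to group the lsof output by the first few path components. This is a rough diagnostic sketch, not anything from the HDP docs, and it assumes the file path sits in lsof's last column:

# lsof -u hive | awk '{print $NF}' | grep '^/' | cut -d/ -f1-4 | sort | uniq -c | sort -rn | head -20

A breakdown like this should make a pile-up such as the operation_logs one described in the update below stand out.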

Update:

We're approaching the 64k nofile ulimit setting again for the hive user.

# lsof -u hive | wc -l

57090

After digging through the lsof output, I see a large number of temporary operation_logs files.

/tmp/hive/operation_logs/658c3930-8975-47db-ad7f-7cbef6279b11/acc2043a-d3bb-4a8c-9a7d-d0b743b9ce5d

Here is the total number of operation_logs files open right now.

# lsof -u hive | grep operation_logs | wc -l

56102

These files are 3 to 4 days old.
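
For anyone else hitting this, two follow-up checks may help narrow it down. First, count how many of the per-session operation_logs directories are older than a couple of days (a minimal sketch, assuming the default log location under /tmp/hive/operation_logs):

# find /tmp/hive/operation_logs -mindepth 1 -maxdepth 1 -type d -mtime +2 | wc -l

Second, if HiveServer2 is keeping those sessions alive, the hive-site.xml properties that govern cleanup of idle sessions and operations are hive.server2.idle.session.timeout, hive.server2.idle.operation.timeout and hive.server2.session.check.interval, and operation logging itself is controlled by hive.server2.logging.operation.enabled and hive.server2.logging.operation.log.location. Those are standard HiveServer2 settings, but I have not confirmed that tuning them resolves this particular leak, so treat them as a starting point rather than a fix.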

10 Replies

New Contributor

On HDP 2.4 you have to change the max open files limit for the hive user through Ambari: Hive -> Advanced hive-env -> hive_user_nofile_limit = 64000.
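
After changing that value in Ambari and restarting HiveServer2, it is worth confirming the limit the running process actually picked up, since a limits change only applies to processes started after it. A minimal check, assuming Linux's /proc filesystem and the usual HiveServer2 main class name:

# hs2_pid=$(pgrep -f org.apache.hive.service.server.HiveServer2 | head -1)

# grep 'Max open files' /proc/${hs2_pid}/limits

The value shown in /proc is the one HiveServer2 is actually running with, regardless of what ulimit -n reports in a fresh shell.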