Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant

Today TFH refused to start with the message "org.apache.hadoop.ipc.RemoteException: The directory item limit of /tmp/hive/hive is exceeded: limit=1048576 items=1048576".

1 ACCEPTED SOLUTION

Master Mentor
@mallikarjunarao m

Take a look at dfs.namenode.fs-limits.max-directory-items:

dfs.namenode.fs-limits.max-directory-items (default: 0): Defines the maximum number of items that a directory may contain. A value of 0 will disable the check.

Take a look at this to clean up /tmp:

https://github.com/nmilford/clean-hadoop-tmp
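In the same spirit as the linked script, here is a minimal shell sketch for clearing stale Hive scratch directories from HDFS. The function name (cleanup_hive_tmp) and the 3-day retention default are illustrative assumptions, not part of the original answer; verify that no long-running jobs still use these directories before deleting anything.

```shell
# Sketch: delete /tmp/hive/hive entries older than N days (default 3).
# Assumes the standard hdfs CLI is on PATH and GNU date is available.
cleanup_hive_tmp() {
  local days="${1:-3}"
  local cutoff
  # Compute the cutoff date in the same YYYY-MM-DD format that
  # `hdfs dfs -ls` prints in column 6.
  cutoff=$(date -d "-${days} days" +%Y-%m-%d)
  # Column 6 is the modification date, column 8 the path.
  hdfs dfs -ls /tmp/hive/hive | awk -v cutoff="$cutoff" '$6 < cutoff {print $8}' |
    while read -r dir; do
      hdfs dfs -rm -r -skipTrash "$dir"
    done
}
```

Run it manually (e.g. `cleanup_hive_tmp 7`) after reviewing what the `hdfs dfs -ls` listing would match.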


2 REPLIES


New Member

As of Hadoop 2.7.0, the value can no longer be set to 0 to disable the check.

dfs.namenode.fs-limits.max-directory-items (default: 1048576): Defines the maximum number of items that a directory may contain. Cannot set the property to a value less than 1 or more than 6400000.


http://hadoop.apache.org/docs/r2.7.0/hadoop-project-dist/hadoop-hdfs/hdfs-default.xml

It is generally recommended that users do not tune these values except in very unusual circumstances.
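If cleanup alone is not enough, the limit can be raised within the documented bounds in hdfs-site.xml. A sketch; the value 3200000 is purely illustrative, and the NameNode must be restarted for the change to take effect:

```xml
<!-- hdfs-site.xml: illustrative value; on Hadoop 2.7.0+ it must be
     between 1 and 6400000 (the check cannot be disabled). -->
<property>
  <name>dfs.namenode.fs-limits.max-directory-items</name>
  <value>3200000</value>
</property>
```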