Support Questions


Today TFH refused to start with the message: "org.apache.hadoop.ipc.RemoteException: The directory item limit of /tmp/hive/hive is exceeded: limit=1048576 items=1048576"

1 ACCEPTED SOLUTION

Master Mentor
@mallikarjunarao m

Take a look at dfs.namenode.fs-limits.max-directory-items:

dfs.namenode.fs-limits.max-directory-items (default: 0): Defines the maximum number of items that a directory may contain. A value of 0 will disable the check.
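
To see where you stand before changing anything, two quick checks (a sketch; note that hdfs getconf reads the local client configuration, which may differ from what the NameNode is actually running with):

hdfs getconf -confKey dfs.namenode.fs-limits.max-directory-items
hdfs dfs -ls /tmp/hive/hive | head -1   # prints "Found N items"; N is the count of immediate children, which is what the limit applies to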

Take a look at this to clean up /tmp:

https://github.com/nmilford/clean-hadoop-tmp
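
If you just want to clear the backlog without installing that script, a minimal shell sketch (assumptions: run as the hive user while HiveServer2 is stopped, so no live scratch directories are lost; Hive should recreate the directory on the next session):

# Dropping the whole per-user scratch directory is far faster than removing
# its ~1M children one at a time; -skipTrash avoids copying them into .Trash.
hdfs dfs -rm -r -skipTrash /tmp/hive/hive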


2 REPLIES


New Contributor

As of Hadoop 2.7.0, the value can no longer be set to 0 to disable the check.

dfs.namenode.fs-limits.max-directory-items (default: 1048576): Defines the maximum number of items that a directory may contain. Cannot set the property to a value less than 1 or more than 6400000.


http://hadoop.apache.org/docs/r2.7.0/hadoop-project-dist/hadoop-hdfs/hdfs-default.xml
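
If cleanup alone is not enough and you do need a higher ceiling on 2.7.0+, a minimal hdfs-site.xml sketch (the value below is arbitrary for illustration; it must stay between 1 and 6400000, and the NameNode typically needs a restart to pick it up):

<property>
  <!-- assumption: 4M items is an illustrative choice, not a recommendation -->
  <name>dfs.namenode.fs-limits.max-directory-items</name>
  <value>4194304</value>
</property>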

It is generally recommended that users do not tune these values except in very unusual circumstances.