Today TFH refused to start with the message "org.apache.hadoop.ipc.RemoteException: The directory item limit of /tmp/hive/hive is exceeded: limit=1048576 items=1048576".
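To confirm the directory really has hit the limit, you can read the item count straight from an HDFS listing. A minimal check, assuming the /tmp/hive/hive path from the error message:

```bash
# The limit applies to the number of direct children of a single
# directory. 'hdfs dfs -ls' on a directory prints "Found N items"
# as its first line, which is exactly the count being checked.
hdfs dfs -ls /tmp/hive/hive | head -1
# If the error is accurate, this prints: Found 1048576 items
```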
Labels: Apache Hive
Created 03-07-2016 11:07 AM
Created 03-07-2016 11:17 AM
Take a look at dfs.namenode.fs-limits.max-directory-items:

| Property | Default | Description |
|---|---|---|
| dfs.namenode.fs-limits.max-directory-items | 0 | Defines the maximum number of items that a directory may contain. A value of 0 will disable the check. |

Also take a look at this to clean up /tmp.
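Since the cleanup link referenced above may not survive here, a hedged sketch of one common manual approach: move the oversized scratch directory aside so Hive can start, recreate it, and delete the old copy. The owner, group, and mode below are assumptions; verify what your cluster uses, and make sure no Hive sessions are active first.

```bash
# Move the full scratch directory aside (a single cheap rename).
hdfs dfs -mv /tmp/hive/hive /tmp/hive/hive_old

# Recreate it. Owner hive and mode 700 are typical for per-user
# Hive scratch dirs, but check your cluster's actual settings.
hdfs dfs -mkdir -p /tmp/hive/hive
hdfs dfs -chown hive:hadoop /tmp/hive/hive
hdfs dfs -chmod 700 /tmp/hive/hive

# Delete the old copy. -skipTrash frees the ~1M namespace objects
# immediately instead of holding them in .Trash until it expires.
hdfs dfs -rm -r -skipTrash /tmp/hive/hive_old
```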
Created 10-11-2018 07:48 AM
As of Hadoop 2.7.0, the check can no longer be disabled by setting the value to 0:

| Property | Default | Description |
|---|---|---|
| dfs.namenode.fs-limits.max-directory-items | 1048576 | Defines the maximum number of items that a directory may contain. Cannot set the property to a value less than 1 or more than 6400000. |

http://hadoop.apache.org/docs/r2.7.0/hadoop-project-dist/hadoop-hdfs/hdfs-default.xml

It is generally recommended that users do not tune these values except in very unusual circumstances.
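If you do need to inspect or raise the limit, a small sketch; the property name comes from the docs linked above, and the example value is only an illustration:

```bash
# Print the value currently configured on this node (hdfs getconf
# reads the client-side configuration, i.e. hdfs-site.xml).
hdfs getconf -confKey dfs.namenode.fs-limits.max-directory-items

# To raise it on Hadoop 2.7+, set the property in hdfs-site.xml on
# the NameNode and restart the NameNode. Valid range is 1..6400000.
#   <property>
#     <name>dfs.namenode.fs-limits.max-directory-items</name>
#     <value>3200000</value>
#   </property>
```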
