
How to resolve the error "The directory item limit of /tmp/hive/hive is exceeded: limit=1048576 items=1048576"?



Hi all,

We have an Ambari-managed cluster (HDP version 2.5.4).

In the Spark Thrift Server log we see the error "The directory item limit of /tmp/hive/hive is exceeded: limit=1048576 items=1048576".

We tried to delete the old files under /tmp/hive/hive, but there are about a million of them and we can't identify which ones to delete, because

hdfs dfs -ls /tmp/hive/hive

doesn't return any output.


Any suggestions? How can we delete the old files even though there are a million of them?

Or is there any other solution?
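As a quick sanity check, hdfs dfs -count asks the NameNode only for a summary, so it can confirm how many entries are actually in the directory even when a full -ls over roughly a million items takes too long or exhausts the client heap (the output columns are DIR_COUNT, FILE_COUNT, CONTENT_SIZE, PATHNAME):

hdfs dfs -count /tmp/hive/hive

If the client itself runs out of memory, raising the client heap via HADOOP_CLIENT_OPTS (for example export HADOOP_CLIENT_OPTS="-Xmx4g") may help; treat this as a sketch rather than a guaranteed fix.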



* For now the Spark Thrift Server does not start successfully because of this error, and HiveServer2 does not start either.

Caused by: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.protocol.FSLimitException$MaxDirectoryItemsExceededException): The directory item limit of /tmp/hive/hive is exceeded: limit=1048576 items=1048576
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2347)
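For reference, the limit in the message matches the HDFS default for dfs.namenode.fs-limits.max-directory-items (1048576), which caps how many children a single directory may have. The value currently configured on the cluster can be checked from a node with the HDFS client configs:

hdfs getconf -confKey dfs.namenode.fs-limits.max-directory-items

Raising that property in hdfs-site.xml (it normally takes effect only after a NameNode restart) only postpones the problem; cleaning out the scratch directory is the real fix.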


Second: can we purge the files automatically, for example from cron or some other scheduler? (A rough cleanup sketch follows the listing below.)


hdfs dfs -ls /tmp/hive/hive
Found 4 items
drwx------   - hive hdfs          0 2019-06-16 21:58 /tmp/hive/hive/2f95f6a5-76ad-487e-968c-1873264a3a9c
drwx------   - hive hdfs          0 2019-06-16 21:45 /tmp/hive/hive/368d201c-cedf-48dc-bbad-f13d6aed7016
drwx------   - hive hdfs          0 2019-06-16 21:58 /tmp/hive/hive/717fb013-535b-4279-a12e-4fc4261c4d68
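Below is a minimal sketch of a cron-able cleanup, assuming it runs as the hdfs (or hive) superuser, that GNU date is available, and that no live HiveServer2/Spark Thrift sessions still own the directories being removed; the retention period is illustrative only:

#!/bin/bash
# Remove Hive scratch session directories under /tmp/hive/hive whose
# modification date (as reported by hdfs dfs -ls) is older than RETENTION_DAYS.
SCRATCH_DIR=/tmp/hive/hive
RETENTION_DAYS=7
CUTOFF=$(date -d "-${RETENTION_DAYS} days" +%s)

# -ls entry format: perms  repl  owner  group  size  date  time  path
hdfs dfs -ls "${SCRATCH_DIR}" | awk 'NR > 1 {print $6, $8}' | \
while read -r mod_date path; do
  if [ "$(date -d "${mod_date}" +%s)" -lt "${CUTOFF}" ]; then
    echo "Removing ${path} (last modified ${mod_date})"
    hdfs dfs -rm -r -skipTrash "${path}"
  fi
done

Note that the script still depends on hdfs dfs -ls completing, which can take a long time when the directory holds around a million entries.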


Michael-Bronson

Master Mentor

@Michael Bronson

The error that you see in the Ambari UI while adding the "" seems to be due to some other inconsistency in the data.

The configuration changes could not be validated for consistency due to an unknown error. Your changes have not been saved yet. Would you like to proceed and save the changes? 

Please check and share the complete Ambari server.log after attempting to enable that property.


I suspect that the "Consistency Check Failure" might be due to some other inconsistency in your config.

Explorer

Limit the ls to a few entries at a time, for example: hdfs dfs -ls /tmp/hive/hive/14*

The directories underneath are zero bytes, e.g.:

drwx------ - hive hdfs 0 2017-09-04 17:10 /tmp/hive/hive/149e8d6a-ad2a-433e-87be-6cb5b27e2b7b/_tmp_space.db

Find the older ones and start purging them manually until you get some breathing room.
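For example, taking one of the older session directories from the listing above (illustration only; -skipTrash avoids copying another million entries into the trash, and the session must no longer be in use):

hdfs dfs -rm -r -skipTrash /tmp/hive/hive/149e8d6a-ad2a-433e-87be-6cb5b27e2b7b
# or a whole batch sharing a prefix:
hdfs dfs -rm -r -skipTrash /tmp/hive/hive/14*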

After that, get approval to implement an automated approach.
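Once approved, a cleanup script along the lines of the sketch in the question above can be scheduled from cron; the script path and schedule below are placeholders only:

# run the (hypothetical) cleanup script every night at 02:00 as the hdfs user
0 2 * * * /usr/local/bin/clean_hive_scratch.sh >> /var/log/clean_hive_scratch.log 2>&1

Depending on the Hive version, hive.start.cleanup.scratchdir and the cleardanglingscratchdir service may also help, but verify that they exist in your release before relying on them.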