Member since: 08-08-2017
Posts: 1652
Kudos Received: 30
Solutions: 11
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1933 | 06-15-2020 05:23 AM |
| | 15562 | 01-30-2020 08:04 PM |
| | 2082 | 07-07-2019 09:06 PM |
| | 8133 | 01-27-2018 10:17 PM |
| | 4607 | 12-31-2017 10:12 PM |
06-16-2019
11:11 PM
So is this setting ( hive.server2.clear.dangling.scratchdir=true ) supported by Hive version 1.2.1.2.6?
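A quick way to check whether a given Hive build recognizes this key is to query it from the CLI. This is a minimal sketch, assuming the Hive client is installed on a cluster node; an unrecognized key is reported as undefined rather than printing a value:

```bash
# Print the current value of the setting; if the key is unknown to this
# Hive build, the CLI reports it as undefined instead of a key=value pair.
hive -e "set hive.server2.clear.dangling.scratchdir;"
```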
06-16-2019
11:05 PM
By the way, in Ambari the Hive version is 1.2.1.2.6.
06-16-2019
11:01 PM
Dear @Jay, OK, but what values need to be set for both parameters?
06-16-2019
10:59 PM
Dear @Jay, you said "I do not think that we should do it on our own". I agree, but we have no choice, because under /tmp/hive/hive we have millions of folders and we can't delete them. After we deleted the folder from HDFS, we saw that a Hive restart creates the /tmp/hive/hive folder again. Do you have any advice on what to check after this brutal action?
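One hedged way to sanity-check the state after such a manual deletion, assuming default scratch-dir settings in hive-site, is to confirm the directory was recreated with the expected owner and permissions and watch how quickly session folders accumulate again:

```bash
# Show the scratch dir entry itself (-d lists the directory, not its children)
hdfs dfs -ls -d /tmp/hive/hive
# Count how many directories/files have accumulated under it again
hdfs dfs -count /tmp/hive/hive
```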
06-16-2019
10:53 PM
Dear @Jay, ( hive.server2.clear.dangling.scratchdir and hive.start.cleanup.scratchdir ) are not configured in Ambari under HIVE --> CONFIGS. Do you recommend adding them? If yes, under which Advanced section do we need to add them, and what are the values for both parameters?
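For illustration only (assuming both parameters are actually supported by this Hive build), properties like these are typically added in Ambari under HIVE > Configs > Advanced > Custom hive-site as key=value pairs, with both enabled:

```
# Hypothetical values; verify against your Hive version's documentation
# before enabling scratch-dir cleanup on a production cluster.
hive.server2.clear.dangling.scratchdir=true
hive.start.cleanup.scratchdir=true
```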
06-16-2019
10:48 PM
For your info: we actually already deleted this folder before you posted your answer, and after we restarted the Hive service in Ambari, it created the /tmp/hive/hive folder again.
06-16-2019
10:47 PM
Second, is it safe to delete the folder with hdfs dfs -rm -r /tmp/hive/hive ?
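If the whole directory is removed, here is a hedged sketch of doing it with HiveServer2 and the Spark Thrift Server stopped first (assumption: no active sessions still hold scratch dirs there), and without flooding the HDFS trash:

```bash
# -skipTrash deletes immediately instead of moving ~1M entries into .Trash,
# which could otherwise hit the same directory item limit under the trash path.
hdfs dfs -rm -r -skipTrash /tmp/hive/hive
```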
06-16-2019
10:46 PM
Dear @Jay, what is the meaning of hive.scratchdir.lock when it is set to false?
06-16-2019
08:56 PM
Hi all,

We have an Ambari cluster (HDP version 2.5.4). In the Spark Thrift log we can see an error saying that /tmp/hive/hive is exceeded: limit=1048576 items=1048576.

We tried to delete the old files under /tmp/hive/hive, but there are about a million of them and we can't delete them, because hdfs dfs -ls /tmp/hive/hive doesn't return any output.

Any suggestions? How can we delete the old files despite there being a million of them, or is there any other solution? For now the Spark Thrift Server doesn't start successfully because of this error, and HiveServer2 doesn't start either:

```
Caused by: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.protocol.FSLimitException$MaxDirectoryItemsExceededException): The directory item limit of /tmp/hive/hive is exceeded: limit=1048576 items=1048576
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2347)
```

Second, can we purge the files, by cron or otherwise?

```
hdfs dfs -ls /tmp/hive/hive
Found 4 items
drwx------   - hive hdfs          0 2019-06-16 21:58 /tmp/hive/hive/2f95f6a5-76ad-487e-968c-1873264a3a9c
drwx------   - hive hdfs          0 2019-06-16 21:45 /tmp/hive/hive/368d201c-cedf-48dc-bbad-f13d6aed7016
drwx------   - hive hdfs          0 2019-06-16 21:58 /tmp/hive/hive/717fb013-535b-4279-a12e-4fc4261c4d68
```
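On the purge-by-cron question: a minimal sketch that removes scratch folders older than a cutoff, assuming GNU date on the node and that hdfs dfs -ls returns output once the listing is manageable again; the 10-day cutoff is an illustrative value, not a recommendation:

```bash
#!/bin/bash
# Purge /tmp/hive/hive subdirectories whose modification date is older
# than N_DAYS. Run as the hive user (or an HDFS superuser), e.g. from cron.
N_DAYS=10
CUTOFF=$(date -d "-${N_DAYS} days" +%Y-%m-%d)   # GNU date assumed

hdfs dfs -ls /tmp/hive/hive | while read -r perms repl owner group size mdate mtime path; do
  # Skip the "Found N items" header line and anything without a path field
  [ -z "$path" ] && continue
  # Dates in YYYY-MM-DD format compare correctly as strings
  if [[ "$mdate" < "$CUTOFF" ]]; then
    hdfs dfs -rm -r -skipTrash "$path"
  fi
done
```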
Labels:
- Apache Ambari
- Apache Hadoop
- Apache Hive
06-16-2019
03:51 PM
From https://stackoverflow.com/questions/44235019/delete-files-older-than-10days-on-hdfs