Created 06-02-2022 09:41 AM
Hello,
Due to a problem with a script, we have almost saturated the available HDFS space.
I suspect this was caused by temporary Hive files that were not cleaned up because the script terminated abnormally.
I would like to check the /tmp/hive folder on HDFS, but my user, which has administrative privileges, cannot access it.
Is there a way to check and clean that folder?
Any help would be appreciated.
KRs,
Andrea
Created 06-07-2022 10:45 AM
Hi Andrea,
you can authenticate with the hdfs keytab (the HDFS superuser) and then clear them:
NAME=hdfs;
# Pick the most recently deployed hdfs keytab from the Cloudera agent's process directory
KEYTAB=$(find /run/cloudera-scm-agent/process -name ${NAME}.keytab -path "*${NAME}-*" | sort | tail -n 1);
# Extract the first hdfs principal listed in that keytab (klist -kt prints it in column 4)
PRINCIPAL=$(klist -kt "$KEYTAB" | awk '{ print $4 }' | grep "^${NAME}" | head -n 1);
# Obtain a Kerberos ticket as the HDFS superuser
kinit -kt "${KEYTAB}" "${PRINCIPAL}"
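Once the ticket is obtained, the scratch directory can be inspected and cleaned. A sketch of the follow-up commands, assuming the default Hive scratch location of /tmp/hive (set by hive.exec.scratchdir); adjust the path if your cluster overrides it:

```shell
# Check how much space the Hive scratch directory is consuming
hdfs dfs -du -s -h /tmp/hive

# List the per-user scratch subdirectories to spot stale ones
hdfs dfs -ls /tmp/hive

# Remove the stale scratch data; -skipTrash frees the space immediately
# instead of moving it to the HDFS trash (which would keep it occupied)
hdfs dfs -rm -r -skipTrash /tmp/hive/*
```

Before deleting everything, make sure no Hive queries are currently running, since active sessions also keep their scratch files under this directory. Hive also ships a dedicated tool for this (`hive --service cleardanglingscratchdir`), which removes only scratch directories whose owning sessions are gone.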
Thanks,
Vinay
Created 06-02-2022 12:00 PM
Hi Andrea,
Is this a kerberized environment?
Best,
-JMP
Created 06-02-2022 02:51 PM
Hi JM, yes it's a kerberized environment.
KRs,
Andrea