Created 02-23-2017 06:00 AM
Hi guys,
How can I make a file undeletable in HDFS? Any suggestions?
Thanks in advance
Created on 02-23-2017 06:09 AM - edited 02-23-2017 06:11 AM
Just put it under the user directory and set the permissions the same way you would on a Linux filesystem, using the hadoop fs shell commands:
Usage: hadoop fs -chown [-R] [OWNER][:[GROUP]] URI [URI ...]
Usage: hadoop fs -chmod [-R] <MODE[,MODE]... | OCTALMODE> URI [URI ...]
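For example (a minimal sketch; /user/alice and keep.txt are made-up names): since HDFS, like Linux, grants the right to delete a file through write permission on the parent directory, locking down the parent is what actually blocks deletion:

hadoop fs -chown alice:hadoop /user/alice/keep.txt   # make alice the owner
hadoop fs -chmod 444 /user/alice/keep.txt            # the file itself becomes read-only
hadoop fs -chmod 555 /user/alice                     # no write on the parent dir, so no deletes inside it

Keep in mind that the HDFS superuser bypasses permission checks entirely, so this only protects against ordinary users.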
In addition, for backup, you can configure HDFS snapshots for point-in-time file recovery:
https://hadoop.apache.org/docs/r2.7.2/hadoop-project-dist/hadoop-hdfs/HdfsSnapshots.html
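A short sketch of the snapshot workflow (the directory name is an example); snapshots are read-only, so anything captured in one can be recovered even after the live copy is deleted:

hdfs dfsadmin -allowSnapshot /user/alice                                   # enable snapshots on the directory (superuser)
hdfs dfs -createSnapshot /user/alice before-cleanup                        # take a named, read-only snapshot
hadoop fs -cp /user/alice/.snapshot/before-cleanup/keep.txt /user/alice/   # restore a deleted file from the snapshot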
Created 02-23-2017 08:57 AM
You can use Access Control Lists (ACLs) to protect your file in HDFS. Please refer to the link below:
https://hortonworks.com/blog/hdfs-acls-fine-grained-permissions-hdfs-files-hadoop/
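A small sketch of ACL usage (the paths and user names are examples; ACLs must first be enabled with dfs.namenode.acls.enabled=true in hdfs-site.xml):

hdfs dfs -setfacl -m user:bob:r-- /data/report.csv   # bob may read the file but not modify it
hdfs dfs -setfacl -m default:user:bob:r-x /data      # default entry inherited by new files and subdirectories under /data
hdfs dfs -getfacl /data/report.csv                   # inspect the resulting ACL entries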
Created 02-23-2017 08:32 PM
The superuser (hdfs) can delete any file in HDFS. So what I need is an HDFS file that cannot be deleted by anyone, even the superuser, the way the chattr command works in Linux. With ACLs I cannot make a file undeletable for all users.
Thanks
Created 10-28-2019 10:40 AM
Since Hadoop 2.8, it is possible to protect a directory so that it cannot be deleted while it still contains files, using the fs.protected.directories property.
From the documentation:
"A comma-separated list of directories which cannot be deleted even by the superuser unless they are empty. This setting can be used to guard important system directories against accidental deletion due to administrator error."
It does not exactly answer the question, but it is a possibility.
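For example (the directory paths here are made up), the property is set in core-site.xml:

<property>
  <name>fs.protected.directories</name>
  <value>/user/warehouse,/apps/critical</value>
</property>

After restarting the NameNode, a recursive delete of either directory will fail while it is non-empty, even when run by the hdfs superuser. Note that this guards only the directories themselves: individual files inside them can still be deleted.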