Just put it under the user's directory and set the permissions on it, just as you would on a Linux filesystem, using the hadoop fs shell commands:

hadoop fs -chown [-R] [OWNER][:[GROUP]] URI [URI ...]
hadoop fs -chmod [-R] <MODE[,MODE]... | OCTALMODE> URI [URI ...]
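For example, to restrict a file to its owner (a minimal sketch; the path /user/alice/report.csv and the user/group names are assumptions for illustration, and the commands require a running HDFS cluster):

```shell
# Set owner and group on the file (assumed path and names)
hadoop fs -chown alice:hadoop /user/alice/report.csv

# Owner read/write only; no access for group or others
hadoop fs -chmod 600 /user/alice/report.csv

# Verify the resulting permissions
hadoop fs -ls /user/alice/report.csv
```

Note that this only restricts ordinary users; the HDFS superuser bypasses permission checks.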
In addition, for backup you can configure HDFS snapshots, which give you point-in-time file recovery.
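A sketch of the snapshot workflow (the directory /user/alice and the snapshot name are assumptions; allowing snapshots requires superuser privileges and a running cluster):

```shell
# Admin enables snapshots on the directory (one-time setup)
hdfs dfsadmin -allowSnapshot /user/alice

# Create a named snapshot; it appears under /user/alice/.snapshot/backup1
hdfs dfs -createSnapshot /user/alice backup1

# Restore a deleted file by copying it back out of the snapshot
hdfs dfs -cp /user/alice/.snapshot/backup1/report.csv /user/alice/
```

Files in a snapshot are read-only, so an accidental delete in the live directory can always be recovered from the snapshot.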
You can use Access Control Lists (ACLs) to protect your files in HDFS. Please refer to the links below.
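A minimal sketch of ACL usage (the path and the user name "bob" are assumptions; ACLs must be enabled with dfs.namenode.acls.enabled=true on the NameNode):

```shell
# Grant a specific user read-only access beyond the base permissions
hadoop fs -setfacl -m user:bob:r-- /user/alice/report.csv

# Inspect the resulting ACL entries
hadoop fs -getfacl /user/alice/report.csv
```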
The superuser (hdfs) can delete any file in HDFS. So what I need is to make an HDFS file that cannot be deleted by anyone, even the superuser, the way the chattr command does on Linux. With ACLs I cannot make a file undeletable for all users.
Since Hadoop 2.8, it is possible to protect a directory, so that none of its files can be deleted, using the fs.protected.directories property:
"A comma-separated list of directories which cannot be deleted even by the superuser unless they are empty. This setting can be used to guard important system directories against accidental deletion due to administrator error."
It does not answer the question exactly, but it is a possibility.