# Tutorial: Protect against accidental deletion

### Question:

Today I saw first-hand the potential of a partial accidental deletion of a colleague's home directory (two hours lost in a critical phase of a project). It worried me enough to start thinking about the problem and a possible solution.

In his case a file named '~' somehow ended up in a test folder, which he later deleted with rm -rf... when rm got to that file, bash expanded it to his home folder (he managed to hit CTRL-C almost in time). A similar problem could happen if one has a file named '*'.

My first thought was to prevent the creation of files with "dangerous names", but that would still not solve the problem, since mv or other corner cases could lead to the risky situation as well.

Second thought was creating a listener (I don't know if this is even possible) or an alias of rm that checks which files it processes and, if it finds a dangerous one, skips it and prints a message. Something similar to this:

1. take all non-option arguments (so as to get the files one wants to delete)
2. cycle over these items
3. check if the current item is equal to a dangerous item (say, '~' or '*'); I don't know if this works: is the item already expanded at this point or not?
4. if so, echo a message and don't do anything to the file
5. proceed with the iteration
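The steps above can be sketched as a bash function (the name safe_rm and the blacklist are my own choices, not an existing tool). Regarding step 3: by the time the function runs, bash has already expanded '~' and '*', so the check must compare against the expanded values ($HOME, the actual file list), not the literal characters:

```shell
# Put this in ~/.bashrc. It filters out a few dangerous expanded paths
# before handing the rest to the real rm.
safe_rm() {
    local arg
    local keep=()
    for arg in "$@"; do
        case "$arg" in
            -*)
                keep+=("$arg") ;;                       # pass options through
            "$HOME"|/|.|..)
                echo "safe_rm: refusing to delete '$arg'" >&2 ;;
            *)
                keep+=("$arg") ;;
        esac
    done
    [ "${#keep[@]}" -gt 0 ] && command rm "${keep[@]}"
}
```

With `alias rm=safe_rm`, a stray `$HOME` in the argument list is skipped with a message while the other files are still removed.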

Third thought: has anyone already done or dealt with this? :]

### Solution:1

There's actually pretty good justification for having critical files in your home directory checked into source control. As well as protecting against the situation you've just encountered, it's nice being able to version-control .bashrc, etc.
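One common way to do this is the "bare repository" dotfiles pattern; this is a sketch of that pattern, not necessarily the answerer's setup, and the dot name and .dotfiles path are my own choices (demo_home stands in for $HOME):

```shell
demo_home=$(mktemp -d)                       # stand-in for $HOME in this sketch
git init --bare "$demo_home/.dotfiles" >/dev/null

# A function rather than an alias, so it also works in scripts.
dot() { git --git-dir="$demo_home/.dotfiles" --work-tree="$demo_home" "$@"; }

dot config status.showUntrackedFiles no      # hide everything not explicitly added
dot config user.email you@example.com
dot config user.name "you"

touch "$demo_home/.bashrc"
dot add "$demo_home/.bashrc"
dot commit -m "track .bashrc" >/dev/null
```

After an accidental deletion, `dot checkout -- .bashrc` restores the last committed version.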

### Solution:2

Since the shell expands the parameters before rm ever runs, you can't really catch 'dangerous' names like that.
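This is easy to see by prefixing the command with echo (run in a scratch directory):

```shell
tmp=$(mktemp -d) && cd "$tmp"
touch file1 file2

# rm never sees the '*': bash replaces it with the matching names first.
echo rm -rf *
# prints: rm -rf file1 file2
```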

You could alias 'rm' to 'rm -i' (interactive), but that can be pretty tedious if you actually mean 'rm -rf *'.

You could replace 'rm' with something like 'mv "$@" $HOME/.trash' (a shell function rather than an alias, since aliases can't take arguments), and have a separate command to empty the trash, but that might cause problems if you really mean to remove the files, because of disk quotas or similar.

Or, you could just keep proper backups or use a file system that allows "undeletion".

### Solution:3

Accidents do happen. You only can reduce the impact of them.

Both version control (regular checkins) and backups are of vital importance here.

If I can't check in (because it does not work yet), I back up to a USB stick.

And as the deadline approaches, the backup frequency increases, because Murphy strikes at the most inappropriate moment.

### Solution:4

I use this in my ~/.bashrc:

alias rm="rm -i"

rm then prompts before deleting anything, and the alias can be circumvented either with the -f flag or by escaping, e.g. \rm file

It reduces the problem, yes; solves it, no.
