scrub-files man page

Name

scrub-files - securely erase files by filling with random data first.


Synopsis

scrub-files [options] paths...


Description

This command securely erases files by filling them with random data in pre-sized chunks; multiple passes of random data may be used. Because the chunks are pre-sized, the rewritten file no longer reveals the exact original file length. Additional options randomly rename the original file before deletion and decompose the file through truncation, breaking down file system metadata about which blocks were originally associated with the erased file. Together these measures are intended to make forensic analysis of securely erased files harder.
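
The following C sketch is purely illustrative and is not taken from scrub-files' own source. It assumes POSIX I/O and a readable /dev/urandom, and the helper name scrub_once is hypothetical. It shows the basic technique the description refers to: the file is overwritten in fixed-size blocks, so the final length is rounded up to a block boundary, and the whole overwrite may be repeated for several passes before the file is removed.

    /* Illustrative sketch only, not scrub-files' actual implementation.
     * Overwrite a file with random data in fixed-size blocks so the final
     * length is block-aligned, repeating for the requested number of
     * passes, then remove the file.  Assumes POSIX I/O and /dev/urandom. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/stat.h>
    #include <unistd.h>

    static int scrub_once(const char *path, size_t blocksize, unsigned passes)
    {
        struct stat st;
        if (stat(path, &st) < 0)
            return -1;

        /* round the original length up to a whole number of blocks */
        off_t blocks = (st.st_size + (off_t)blocksize - 1) / (off_t)blocksize;
        unsigned char *buf = malloc(blocksize);
        if (!buf)
            return -1;

        int rnd = open("/dev/urandom", O_RDONLY);
        int fd = open(path, O_WRONLY);
        if (rnd < 0 || fd < 0) {
            if (rnd >= 0) close(rnd);
            if (fd >= 0) close(fd);
            free(buf);
            return -1;
        }

        for (unsigned pass = 0; pass < passes; ++pass) {
            lseek(fd, 0, SEEK_SET);
            for (off_t b = 0; b < blocks; ++b) {
                /* fresh random data for every block of every pass */
                if (read(rnd, buf, blocksize) != (ssize_t)blocksize ||
                    write(fd, buf, blocksize) != (ssize_t)blocksize)
                    break;
            }
            fsync(fd);          /* push each pass out to the device */
        }

        close(fd);
        close(rnd);
        free(buf);
        return unlink(path);    /* finally remove the scrubbed file */
    }

    int main(int argc, char **argv)
    {
        for (int i = 1; i < argc; ++i)
            if (scrub_once(argv[i], 1024, 1) < 0)
                perror(argv[i]);
        return 0;
    }

In scrub-files itself the block size and pass count correspond to the --blocksize and --passes options described below.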


Options

--blocksize size
Set the default block size (in 1k increments) that scrub-files uses when writing random data. This affects both the final file length, which is aligned to the specified size, and the way the --truncate option decomposes files. The default is 1k.
--follow
Dereference and follow symbolic links, erasing the target file.

--passes count
The number of passes used when writing random data. The default is 1 pass.

--recursive
If an argument is a directory, recursively scan the directory and any subdirectory contents as arguments.

--rename
Rename the file randomly before deletion to clear persistent inode data.

--truncate
Decompose the file through truncation to break down file system page maps (see the sketch after this option list, which also covers --rename).

--verbose
Display each file being processed on the console.

--help
Output a help screen for the user.
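
The effect of --rename and --truncate can be pictured with the following sketch, which is illustrative only and not drawn from scrub-files' source. The helper name rename_and_truncate and the ".scrub" temporary name pattern are hypothetical; only plain POSIX calls are used. The idea is that the file first receives a meaningless random name, and is then shrunk one block at a time so the file system gives up its blocks in stages before the final unlink().

    /* Illustrative sketch only; not scrub-files' actual implementation.
     * Rename the (already scrubbed) file to a random name, then shrink it
     * block by block before unlinking, so directory entries and block
     * maps retain as little of the original file as possible. */
    #include <libgen.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/stat.h>
    #include <unistd.h>

    static int rename_and_truncate(const char *path, off_t blocksize)
    {
        /* build a random name in the same directory (hypothetical pattern) */
        char dirbuf[4096], newpath[4096];
        snprintf(dirbuf, sizeof dirbuf, "%s", path);
        snprintf(newpath, sizeof newpath, "%s/.scrub%ld",
                 dirname(dirbuf), (long)random());

        if (rename(path, newpath) < 0)
            return -1;

        struct stat st;
        if (stat(newpath, &st) < 0)
            return -1;

        /* shrink one block at a time so block-to-file mappings decay gradually */
        for (off_t len = st.st_size; len > 0; len -= blocksize)
            if (truncate(newpath, len > blocksize ? len - blocksize : 0) < 0)
                return -1;

        return unlink(newpath);
    }

    int main(int argc, char **argv)
    {
        srandom((unsigned)getpid());
        for (int i = 1; i < argc; ++i)
            if (rename_and_truncate(argv[i], 1024) < 0)
                perror(argv[i]);
        return 0;
    }

In scrub-files itself these behaviours are enabled with the --rename and --truncate options described above.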


Author

scrub-files was written by David Sugar <dyfet@gnutelephony.org>.

Reporting Bugs

Report bugs to bug-commoncpp@gnu.org or bugs@gnutelephony.org.


GNU uCommon GNU Telephony January 2010