genbackupdata man page

Name

genbackupdata — generate backup test data


Synopsis

genbackupdata [--chunk-size=SIZE] [--config=FILE] [--dump-config] [--dump-setting-names] [--generate-manpage=TEMPLATE] [-h] [--help] [--help-all] [--list-config-files] [--version] [--no-default-configs] [-cSIZE] [--create=SIZE] [--depth=DEPTH] [--dump-memory-profile=METHOD] [--file-size=SIZE] [--log=FILE] [--log-keep=N] [--log-level=LEVEL] [--log-max=SIZE] [--log-mode=MODE] [--max-files=MAX-FILES] [--memory-dump-interval=SECONDS] [--output=FILE] [--quiet] [--no-quiet] [--seed=SEED] [FILE]...


Description

genbackupdata generates test data sets for performance testing of backup software. It creates a directory tree filled with files of different sizes. The total amount of data and the distribution of sizes between small and big files are configurable. The program can also modify an existing directory tree by creating new files and by deleting, renaming, or modifying existing ones. This makes it possible to generate test data for successive generations of backups.

The program is deterministic: with a given set of parameters (and a given pre-existing directory tree), it always creates the same output. This way, it is possible to reproduce backup tests exactly, without having to distribute the potentially very large test sets.
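A minimal Python sketch (not genbackupdata's actual code) illustrates the principle: a pseudo-random generator seeded with a fixed value reproduces the same byte stream on every run, so a test set never needs to be shipped, only regenerated.

```python
import random

def generate_chunk(seed, size):
    """Return `size` pseudo-random bytes derived only from `seed`."""
    rng = random.Random(seed)  # independent generator with a fixed seed
    return bytes(rng.randrange(256) for _ in range(size))

# The same seed always reproduces exactly the same data...
assert generate_chunk(42, 1024) == generate_chunk(42, 1024)
# ...while a different seed yields a different stream.
assert generate_chunk(42, 1024) != generate_chunk(7, 1024)
```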

The data set consists of plain files and directories. Files are either small text files or big binary files. Text files contain the "lorem ipsum" stanza; binary files contain randomly generated byte streams. The percentage of the data that goes into small text files versus big binary files can be set, as can the size of each file type.
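The arithmetic behind that split can be sketched in Python (the function and the example numbers are illustrative assumptions, not genbackupdata's code or defaults):

```python
def file_counts(total_bytes, text_percent, text_size, binary_size):
    """Split a total data size between small text and big binary files."""
    text_bytes = total_bytes * text_percent // 100   # share going to text files
    binary_bytes = total_bytes - text_bytes          # the remainder is binary
    return text_bytes // text_size, binary_bytes // binary_size

# 100 MiB total, 10% in 4 KiB text files, the rest in 1 MiB binary files:
texts, binaries = file_counts(100 * 2**20, 10, 4 * 2**10, 2**20)
# → 2560 text files and 90 binary files
```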

Files and directories are named "fileXXXX" and "dirXXXX", where "XXXX" is a sequential integer, counted separately for files and for directories. There is an upper limit to how many files a directory may contain. When the limit is reached, a new sub-directory is created. The first set of files goes into the root directory of the test set.
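A simplified Python model of this naming scheme (flat sub-directories only; the real program also builds deeper trees, see --depth) shows how files spill into a new directory each time the per-directory limit is reached:

```python
def layout(n_files, max_per_dir):
    """Assign sequential fileNNNN names, opening a new dirNNNN
    sub-directory whenever the per-directory limit is reached."""
    paths = []
    prefix = ""      # the first batch of files goes into the root
    next_dir = 0
    for i in range(n_files):
        if i >= max_per_dir and i % max_per_dir == 0:
            prefix = "dir%04d/" % next_dir   # start a new sub-directory
            next_dir += 1
        paths.append(prefix + "file%04d" % i)
    return paths

# With at most 3 entries per directory, 7 files are laid out as:
#   file0000 file0001 file0002
#   dir0000/file0003 dir0000/file0004 dir0000/file0005
#   dir0001/file0006
```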

You have to give at least one of the options --create, --delete, --rename, or --modify for the program to do anything. You can give more than one of them, provided DIR already exists. (If the same option is given more than once, only the last instance counts.) DIR is created if it does not already exist.



Options

--chunk-size=SIZE

generate data in chunks of this size

-c, --create=SIZE

how much data to create (default: 0)


--depth=DEPTH

depth of directory tree


--file-size=SIZE

size of one file


--generate-manpage=TEMPLATE

fill in manual page TEMPLATE

-h, --help

show this help message and exit


--max-files=MAX-FILES

max files/dirs per dir


--output=FILE

write output to FILE, instead of standard output


--quiet

do not report progress


--no-quiet

opposite of --quiet


--seed=SEED

seed for the random number generator


--version

show program's version number and exit

Configuration files and settings


--config=FILE

add FILE to config files


--dump-config

write out the entire current configuration


--dump-setting-names

write out all names of settings and quit


--help-all

show all options


--list-config-files

list all possible config files


--no-default-configs

clear list of configuration files to read



Logging

--log=FILE

write log entries to FILE (default is to not write log files at all); use "syslog" to log to the system log, "stderr" to log to the standard error output, or "none" to disable logging


--log-keep=N

keep last N logs (default: 10)


--log-level=LEVEL

log at LEVEL, one of debug, info, warning, error, critical, fatal (default: debug)


--log-max=SIZE

rotate logs larger than SIZE; zero means never (default: 0)


--log-mode=MODE

set permissions of new log files to MODE (octal; default 0600)



Performance

--dump-memory-profile=METHOD

make memory profiling dumps using METHOD, which is one of: none, simple, or meliae (default: simple)


--memory-dump-interval=SECONDS

make memory profiling dumps at least SECONDS apart


Examples

Create data for the first generation of a backup:

genbackupdata --create=10G testdir

Modify an existing set of backup data to create a new generation:

genbackupdata -c 5% -d 2% -m 5% -r 0.5% testdir

The above command can be run for each new generation.
