Agent for timely backup for all nsf

Hi All,

I have a query regarding backups: I want to take timely backups of all Domino server NSFs. How could I do this? Please help me.

thanks and regards

zubair a khan

Subject: Re: Agent for timely backup for all nsf

AFAIK there are 3 ways:

  1. Using backup software that is capable of backing up files that are currently open on the server

  2. Using a secondary Domino backup server where you replicate data

  3. Poor man’s manual backup, where you shut the server down nightly and create backups with scheduled OS scripts

Each way has its good and bad sides… 1 = usually more expensive; 2 = “bad data” gets replicated too; 3 = requires shutting down Domino.

Here’s my “poor man’s” method (on SLES 9). It works as long as there isn’t a huge amount of data and there is plenty of free disk space. You also need the option to keep the Domino server down for a moment…

/etc/crontab:

# stop Domino at 6:00 and restart it at 6:30; backups run in between
00 6 * * * root /etc/init.d/domino stop
30 6 * * * root /etc/init.d/domino start

# local backups into a per-day folder (0 = Sunday … 6 = Saturday)
05 6 * * 0 root /path/to/local_backup sun
05 6 * * 1 root /path/to/local_backup mon
05 6 * * 2 root /path/to/local_backup tue
05 6 * * 3 root /path/to/local_backup wed
05 6 * * 4 root /path/to/local_backup thu
05 6 * * 5 root /path/to/local_backup fri
05 6 * * 6 root /path/to/local_backup sat

# copy the finished backups to the remote server
35 6 * * 0 root /path/to/remote_backup sun
35 6 * * 1 root /path/to/remote_backup mon
35 6 * * 2 root /path/to/remote_backup tue
35 6 * * 3 root /path/to/remote_backup wed
35 6 * * 4 root /path/to/remote_backup thu
35 6 * * 5 root /path/to/remote_backup fri
35 6 * * 6 root /path/to/remote_backup sat
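Not from the original setup, but the seven per-day crontab lines could be collapsed into one by letting a small wrapper derive the day name itself; a minimal sketch (the /path/to placeholders and the local_backup script are the ones above, the wrapper name is hypothetical):

```shell
#!/bin/sh
# Hypothetical wrapper, so a single crontab line like
#   05 6 * * * root /path/to/local_backup_daily
# could replace the seven per-day entries above.
# LC_ALL=C pins the English day abbreviations regardless of locale.
day=$(LC_ALL=C date +%a | tr '[:upper:]' '[:lower:]')   # "sun" .. "sat"
echo "$day"
# /path/to/local_backup "$day"
```

Same idea would work for remote_backup. Whether that is clearer than seven explicit lines is a matter of taste; the explicit lines at least make the schedule obvious at a glance.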

  • at 6:00 the server goes down

  • while it’s down, the daily backups are created in .gz format (I keep a 7-day backup history in daily subfolders)

  • Domino comes back up at 6:30, which leaves 30 minutes to create the backups

  • starting at 6:35, the backed-up files are copied to a remote server (in effect I have two sets of backups: one on the Domino server and one on the remote server)

local_backup script:

#!/bin/sh
# local_backup: compress databases into the per-day folder given as $1
echo "BACKUP ($1) - Local file copy starting at $(date)..." >> /path/to/backup.log

echo "$(date) begin names.nsf" >> /path/to/backup.log
gzip -c /local/notesdata/names.nsf > /path/to/$1/names.nsf.gz
echo "$(date) end names.nsf" >> /path/to/backup.log

# ... next database(s) the same way as above ...

echo "BACKUP ($1) - Local file copy finished at $(date)..." >> /path/to/backup.log
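Instead of listing every database by hand, the copy-and-compress step could also loop over all .nsf files in the data directory. A sketch, wrapped in a function so the directories are parameters (the function name and arguments are my own, not from the original script):

```shell
#!/bin/sh
# Hypothetical variant of local_backup that picks up every .nsf
# automatically instead of naming each database.
#   $1 = Notes data dir, $2 = per-day backup dir, $3 = log file, $4 = day label
backup_nsf() {
    echo "BACKUP ($4) - Local file copy starting at $(date)..." >> "$3"
    for db in "$1"/*.nsf; do
        [ -e "$db" ] || continue            # no .nsf files at all
        name=$(basename "$db")
        echo "$(date) begin $name" >> "$3"
        gzip -c "$db" > "$2/$name.gz"
        echo "$(date) end $name" >> "$3"
    done
    echo "BACKUP ($4) - Local file copy finished at $(date)..." >> "$3"
}

# Usage with the placeholder paths from the post:
# backup_nsf /local/notesdata /path/to/mon /path/to/backup.log mon
```

Note this also picks up databases you might not want to back up (mail.box, log.nsf and such), so in practice you may still want an explicit list or an exclude check inside the loop.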

remote_backup:

#!/bin/sh
# remote_backup: push the per-day folder ($1) to the remote backup host
echo "BACKUP ($1) - Remote file copy starting at $(date)..." >> /path/to/backup.log

rsync -e ssh /path/to/$1/*.gz user_name@host_dest:/path/to/$1/

echo "BACKUP ($1) - Remote file copy finished at $(date)..." >> /path/to/backup.log

  • your public SSH key needs to be copied to the destination host, so files can be moved over SSH without prompting for a username/password
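That one-time key setup could look like this with the standard OpenSSH tools (user_name and host_dest are the same placeholders as in the rsync line, and the unencrypted key is a trade-off you accept for unattended runs):

```shell
# Generate a passphrase-less key pair for root, unless one already exists.
mkdir -p "$HOME/.ssh"
test -f "$HOME/.ssh/id_rsa" || ssh-keygen -t rsa -N "" -f "$HOME/.ssh/id_rsa" -q

# Install the public key on the destination host (prompts for the
# password once):
#   ssh-copy-id user_name@host_dest
# After that, rsync -e ssh runs without a password prompt.
```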

So in short: the server is taken down, the required files are compressed to .gz format, the server is brought back up, and the compressed files are copied to another server over the network.

Just an example. Probably doesn’t suit most admins…