The Domino backup system unifies all Domino user data (Workbench and Domino app configuration) into a single cron job, producing a single tarball backup. This cron job is called domino-workbench-backup.
The essential information necessary to recover Domino projects and data is stored in three systems, each of which has its own backup process. Deployments in the cloud will have these systems automatically backed up to AWS Simple Storage Service (S3) at least once per day.
The three storage systems are:
- Blob Store: stores the files users upload or add to their projects
- Git Server: stores the revision history of all projects
- MongoDB: stores user accounts, run history, comments, environments, and configuration
See below for more information on each service, plus instructions on how to perform manual backups for on-premises deployments.
The easiest way to manually trigger a backup of an entire Domino deployment that has access to AWS is with the domino-deployment command line interface. This command line tool is likely available on an instance somewhere in your deployment. Ask Domino Support or your Customer Success Engineer if you do not know where to find it. To trigger a backup of your deployment, run:
bin/domino-deployment backup
This will trigger backups of MongoDB and Git to S3, and it will create Amazon Machine Image (AMI) snapshots of the host servers in your deployment.
The blob store contains the content of the files that users have in their projects. This is distinct from the Git server, which organizes those projects into revisions.
Blob store manual on-premises backups
For on-premises installations, the blob store is a directory on disk, either on a network (NFS) drive, or on the local drive of a single server installation. Its size depends on the project data in Domino and can be up to many TBs. It can be backed up without downtime with the following procedure:
- Determine the location of the blob store on disk:
  - Sign in to the Domino web interface as a user with administrative privileges.
  - Click your username on the top right, then click Admin.
  - Click Advanced, then click Central Config.
  - Look for the line with key com.cerebro.domino.blobStorageMedium. Ensure that its value is FileSystem.
  - Look for the line with key com.cerebro.domino.blobFileRoot. Its value will be the path to the blob store, e.g. /domino/blobs.
- SSH into the server hosting the Domino web interface. This will be the address you use to load the Domino application in your browser. If you do not have access to the private SSH key for this deployment, contact your administrator.
- On this host, the blob store directory (e.g. /domino/blobs per the example above) is the directory that you need to back up to preserve all project files. The blob store can get very large in an active Domino deployment, so setting up an incremental backup process where only new files are copied is recommended. Once created, files in the blob store are never modified or deleted by Domino.
The Git server contains the revision history of all projects. It tracks changes to files and is used to reconstruct your project at a particular state when you sync or browse files in the UI. It contains only metadata, in the form of links to files that are stored in the blob store.
Git server cloud backups
For AWS deployments, Git server data is automatically backed up daily and uploaded to S3. The most recent few days of backups are also available on the central server under /domino/backup/git. To confirm backups are being generated, SSH into the server hosting the Domino web interface and run ls /domino/backup/git/. You should see output like this:
ls /domino/backup/git/
20171112-0651.tar.gz 20171115-0650.tar.gz 20171118-0637.tar.gz
20171113-0649.tar.gz 20171116-0647.tar.gz 20171119-0645.tar.gz
20171114-0630.tar.gz 20171117-0642.tar.gz 20171120-0628.tar.gz
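That check can be automated with a small script that reports the age in days of the newest tarball. This is an illustrative sketch, not part of Domino: the function name and the two-day threshold are assumptions, and it relies on GNU stat's -c %Y option.

```shell
# Print the age in days of the newest .tar.gz in a backup directory,
# or 9999 if none exist (so a missing directory also triggers alerts).
latest_backup_age_days() {
    dir=$1
    newest=$(ls -t "$dir"/*.tar.gz 2>/dev/null | head -n 1)
    if [ -z "$newest" ]; then
        echo 9999
        return
    fi
    now=$(date +%s)
    mtime=$(stat -c %Y "$newest")   # file modification time, seconds since epoch
    echo $(( (now - mtime) / 86400 ))
}

# Example: warn if the newest Git backup is more than two days old.
# if [ "$(latest_backup_age_days /domino/backup/git)" -gt 2 ]; then
#     echo "WARNING: Git backups appear stale" >&2
# fi
```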
Git server manual on-premises backups
On-premises deployments will need to set up their own backups for the Git server. Backing up the Git server involves backing up the directory that contains all of its Git repositories. Depending on the number of projects and revisions, this directory can be larger than 100 MB. It can be backed up without downtime with the following procedure:
- Determine the Git server address:
  - Sign in to Domino’s web interface as a user with administrative privileges.
  - Click your username on the top right, then click Admin.
  - Click Advanced, then click Central Config.
  - Look for the line with key com.cerebro.domino.internalGitRepoHost. Its value should be something like http://10.0.13.163:9001. The server address in this example would be 10.0.13.163. Record this address for use in the next step.
- SSH into the Git server using the address from the previous step.
- Navigate to /domino/git. Within it there should be a directory called projectrepos. If /domino/git does not exist, search for projectrepos system-wide by running: find / -name projectrepos
- This is the directory that you need to back up. You can compress it to a single file by running:
  tar czf domino_git_backup.tar.gz projectrepos
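For recurring backups, the tar step above can be wrapped in a script that produces timestamped tarballs matching the YYYYMMDD-HHMM.tar.gz naming shown for cloud backups. The function name and output directory here are illustrative, not part of Domino.

```shell
# Create a timestamped tarball of the projectrepos directory.
make_git_backup() {
    repos_dir=$1   # e.g. /domino/git/projectrepos
    out_dir=$2     # backup target directory (an assumption, pick your own)
    mkdir -p "$out_dir"
    stamp=$(date +%Y%m%d-%H%M)
    # -C changes into the parent directory first, so the archive contains
    # projectrepos/... rather than absolute paths.
    tar czf "$out_dir/$stamp.tar.gz" \
        -C "$(dirname "$repos_dir")" "$(basename "$repos_dir")"
    echo "$out_dir/$stamp.tar.gz"
}

# Example: make_git_backup /domino/git/projectrepos /backup/domino-git
```

Run from cron, this yields a directory listing like the cloud backup output shown earlier.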
MongoDB contains user account information, run history and textual output (stdout), comments, environments, and system configuration. In addition to user data and other app-related functionality, it contains a “projects” collection that links each project in the UI to a Git repository.
MongoDB cloud backups
For AWS deployments, MongoDB data is automatically backed up daily and uploaded to S3. The most recent few days of backups are also available on the central server under /domino/backup/mongodb. To confirm backups are being generated, SSH into the server hosting the Domino web interface and run ls /domino/backup/mongodb/. You should see output like this:
ls /domino/backup/mongodb/
20171112-0651.tar.gz 20171115-0650.tar.gz 20171118-0637.tar.gz
20171113-0649.tar.gz 20171116-0647.tar.gz 20171119-0645.tar.gz
20171114-0630.tar.gz 20171117-0642.tar.gz 20171120-0629.tar.gz
MongoDB manual on-premises backups
For on-premises deployments, backing up the Mongo database involves exporting the contents of the database to disk and then backing up the resulting files. Depending on the number of runs and the size of run outputs in Domino, the Mongo database can be larger than 10GB. It can be backed up without downtime with the following procedure:
- Determine the Mongo server address and the password for the domino Mongo user:
  - Sign in to Domino’s web interface as a user with administrative privileges.
  - Click your username on the top right, then click Admin.
  - Click Advanced, then click Central Config.
  - Look for the line with key mongodb.default.uri. Its value should be something like:
    mongodb://domino:kyf9C9b6OJ9gwI1GfNIb9iWUOLSSZK31@13.0.128.30:27017/domino?connectTimeoutMS=60000&socketTimeoutMS=60000&maxPoolSize=1000
    The password is delimited by domino: on the left and @ on the right. In the example above the password is kyf9C9b6OJ9gwI1GfNIb9iWUOLSSZK31. Record your database’s password for use in the next step. The server address is delimited by @ on the left and :27017 on the right. In the example above the address is 13.0.128.30. Record this address for use in the next step.
- SSH into the Mongo server using the address from the previous step.
- Navigate to a directory with enough free space, e.g. /tmp.
- Execute the following command to export the Mongo data to disk, replacing $PASSWORD with the password you retrieved and recorded earlier:
  mongodump -u domino -p $PASSWORD -d domino -o domino_mongo_backup
  This command exports the database contents to a new directory named domino_mongo_backup. This is the directory you need to back up. You can compress it to a single file by running:
  tar czf domino_mongo_backup.tar.gz domino_mongo_backup