How to back up your DSS data

August 22, 2016

We strongly recommend that you perform periodic backups of your DSS data. We also recommend that you back up before upgrading DSS.

Your DSS Data Directory (or DATA_DIR) contains your configuration, your projects (graphs, recipes, notebooks, etc.), your connections to databases, the filesystem_managed files, etc.
Note that this directory does not contain datasets stored outside of the server: SQL servers, cloud storage, Hadoop, etc.

Full backup

The simplest way to back up your data directory is to perform a full backup of the whole folder:

  • Make sure you don't have any job running, then stop DSS:
    DATA_DIR/bin/dss stop
  • Compress your data directory:
    tar -zcvf your_backup.tar.gz /path/to/DATA_DIR/
  • Restart DSS:
    DATA_DIR/bin/dss start
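
If you want to check the archive afterwards, a quick integrity test (the archive path is a placeholder) is:

tar -tzf your_backup.tar.gz > /dev/null && echo "Archive OK"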

Other backup methods

The above-mentioned method using tar is very simple, but it always performs full backups, which might not be practical with large data directories.

There are many other backup methods, and listing them all is outside the scope of this document, but we can mention:

  • Using rsync for incremental backups, either on the same machine or to another machine (see the sketch after this list)
  • The duply / duplicity combination
  • Filesystem-level snapshots (XFS snapshots, for instance)
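
For the rsync option, a minimal sketch might look like the following (the backup host and paths are placeholders; stopping DSS first keeps the copy fully consistent):

# Incremental copy of the data directory to a backup host (placeholder host and paths)
DATA_DIR=/path/to/DATA_DIR

$DATA_DIR/bin/dss stop
rsync -a --delete "$DATA_DIR/" backupuser@backuphost:/backups/dss-data/
$DATA_DIR/bin/dss start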

Important note: at the moment, full consistency of the backup is only guaranteed if the backup is executed while DSS is not running. Note, however, that all critical DSS files are text files that are written atomically, so partially-consistent backups (taken while DSS was running) will still be mostly recoverable.

Restoring a backup

To restore a backup, you need to restore the files that you backed up to their original location.

"Pristine restore"

A pristine restore means restoring the backed-up DSS data:

  • on the same machine as the original one
  • at the original location on the machine
  • on the same DSS version

For this kind of restoration, you simply need to replace the content of DATA_DIR with the content of the archive (a full command sequence is sketched after the steps below):

  • If applicable, stop the currently running DSS, and move away the current content of the DATA_DIR

  • Restore the backup

    cd DATA_DIR
    tar -zxvf your_backup.tar.gz

  • Restart DSS
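
Putting these steps together, a minimal command sequence might look like this (paths are placeholders; the extraction step depends on how the archive was created):

# Pristine restore: same machine, same location, same DSS version (placeholder paths)
DATA_DIR=/path/to/DATA_DIR

# If DSS is still running, stop it, then move away the current content
$DATA_DIR/bin/dss stop
mv "$DATA_DIR" "${DATA_DIR}.old"

# Recreate the directory and extract the backup into it
# (inspect the archive layout first with: tar -tzf your_backup.tar.gz | head)
mkdir "$DATA_DIR"
cd "$DATA_DIR"
tar -zxvf /path/to/your_backup.tar.gz

# Restart DSS
$DATA_DIR/bin/dss start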

Restore on another machine, another location, or another DSS version

You can only restore a backup on a newer DSS version, not on an older one.

Restoration procedure:

  • If restoring on another machine, download and uncompress the DSS software on the new machine

  • Restore the backup files

    mkdir newdatadirlocation
    cd newdatadirlocation
    tar -zxvf your_backup.tar.gz

  • Replay the installer in "upgrade" mode: this will "reattach" the restored datadir to the installation directory. It will also, if needed, migrate to the newer DSS version:
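
    The exact invocation depends on where the DSS installation kit is unpacked; it typically looks like the following (the kit directory and paths are placeholders):

    /path/to/dataiku-dss-VERSION/installer.sh -d /path/to/DATA_DIR -u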


  • If you installed the data directory on a different machine or in a different location, you need to rebuild the Python environment. See the "Migrating the data directory" section of our documentation on migrations.

  • Replay the various "integration" setup scripts:
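
    Which scripts to replay depends on your setup; for instance, if you use the Hadoop, Spark, or R integrations, they are typically re-run like this (only run the ones that apply to your installation):

    /path/to/DATA_DIR/bin/dssadmin install-hadoop-integration
    /path/to/DATA_DIR/bin/dssadmin install-spark-integration
    /path/to/DATA_DIR/bin/dssadmin install-R-integration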

Running an automatic backup

Here is an example shell script that you can run periodically within a cron task.

#!/bin/bash
# Purpose: backup of the DSS DATA_DIR directory

# Placeholder paths: adjust them to your installation
SRCDIR=/path/to/DATA_DIR
DESDIR=/path/to/backup/destination

# Stop DSS so that the backup is fully consistent
$SRCDIR/bin/dss stop

# Compress the data directory (maximum gzip compression)
export GZIP=-9
TIME=$(date +"%Y-%m-%d")
tar -cpzf $DESDIR/backup-dss-data-$TIME.tar.gz $SRCDIR

# Restart DSS
$SRCDIR/bin/dss start

# Optionally remove old backups
OLD=$(date -d "4 days ago" +"%Y-%m-%d")
rm -f $DESDIR/backup-dss-data-$OLD.tar.gz


Save this script in a file (for instance, backup_dss.sh) and set up a cron task like the following one (running from Monday to Friday at 6:15 AM):

15 6 * * 1-5 /path/to/backup_dss.sh

What can be excluded from the backup

Temporary / Cache data

The data directory contains some folders which can safely be excluded from the backup because they only contain temporary data which can be rebuilt:

  • tmp
  • caches
  • elasticsearch
  • lambda-devserver


In addition, the following folders only contain log data, which you might want to exclude from backup:

  • run
  • jobs
  • scenarios
  • diagnosis
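
If you use the simple tar-based full backup, these folders can be skipped with --exclude options. A minimal sketch (placeholder paths) might look like this:

# Full tar backup that skips temporary, cache, and log folders
# (the exclude patterns are relative to the parent of DATA_DIR)
cd /path/to
tar -zcvf your_backup.tar.gz \
    --exclude='DATA_DIR/tmp' \
    --exclude='DATA_DIR/caches' \
    --exclude='DATA_DIR/elasticsearch' \
    --exclude='DATA_DIR/lambda-devserver' \
    --exclude='DATA_DIR/run' \
    --exclude='DATA_DIR/jobs' \
    --exclude='DATA_DIR/scenarios' \
    --exclude='DATA_DIR/diagnosis' \
    DATA_DIR/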

Datasets stored outside of DATA_DIR aren't affected by a DSS upgrade: they will still be available after the upgrade.

Other ignorable data

The following folders contain data which you might consider excluding:

  • exports (contains the data exports made by users)
  • prepared_bundles (contains automation bundles already exported)
  • apinode-packages (contains apinode packages already exported)


The following folders contain data built by DSS. This data can generally be rebuilt, but caution should be exercised when choosing whether to back up these folders:

  • managed_datasets
  • analysis-data
  • saved_models

Applies to: DSS 3.0 and above