UPDATE: This script has been superseded by my Cloud Database Backup script, which I'd advise using instead.
If you're a developer you should back up your data. Go ahead – do it, do it now. Okay, now that we've got that out of the way, here's a nice automated solution for backing up your MySQL databases and optionally uploading the result to Amazon S3 for added safety.
The requirements for this tutorial are as follows:
- Any Linux distribution
- MySQL (and the mysqldump tool) with valid login credentials
- PHP 5 with the cURL extension (only needed for S3 uploads)
- crontab
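If you're not sure whether all of these are installed, asking the shell where each tool lives is a quick check (this assumes your PHP 5 binary is named php5, as on Debian-family systems):

```
which mysql mysqldump zip php5 crontab
```

Any tool that which doesn't print a path for will need to be installed through your distribution's package manager.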
Our script will use mysqldump to export each database into its own SQL file prefixed with today's date, zip the dumps, clean up the directory, and, if requested, upload a copy of the zip file to Amazon S3.
To get started, create a folder named db_backup. For this tutorial I've placed db_backup in my home folder. In db_backup, create a directory named backups and an empty text file named backup.bin.
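On the command line, that setup looks like this:

```
mkdir -p ~/db_backup/backups
cd ~/db_backup
touch backup.bin
```

Fill backup.bin with the following code, replacing the DB and S3 variables at the top with your own information: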
```
#!/bin/sh
DATE=`date +%y-%m-%d`  # current date
DIR=`dirname "$0"`/    # directory this script lives in

# MySQL vars
DB_USER=root
DB_PASS=password
DB_HOST=localhost

# Backup directory vars
BACKUP_DIR=${DIR}backups/

# S3 vars
S3=0  # Set to 1 to back the dumps up to S3
S3_KEY="Your S3 Key"
S3_SEC_KEY="Your S3 Secret Key"
S3_BUCKET=db_backups

# Get the list of databases
LIST=`mysql -u${DB_USER} -p${DB_PASS} -h${DB_HOST} INFORMATION_SCHEMA -e "SELECT SCHEMA_NAME FROM SCHEMATA WHERE SCHEMA_NAME != 'information_schema';"`

# Loop through the list, skipping the SCHEMA_NAME column header
for each in $LIST; do
    if [ "$each" != "SCHEMA_NAME" ]; then
        mysqldump -u${DB_USER} -p${DB_PASS} -h${DB_HOST} --opt --single-transaction "$each" > ${BACKUP_DIR}${DATE}_${each}.sql
    fi
done

# Zip and remove the SQL dumps
zip -qj ${BACKUP_DIR}dbBackup_${DATE} ${BACKUP_DIR}${DATE}_*.sql
rm ${BACKUP_DIR}${DATE}_*.sql

# Back up to S3
if [ "${S3}" != 0 ]; then
    php5 -f "${DIR}S3/backup.php" "${S3_KEY}" "${S3_SEC_KEY}" "${S3_BUCKET}" "${BACKUP_DIR}dbBackup_${DATE}.zip"
fi
```
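A backup is only as good as your ability to restore it, so it's worth testing the round trip once. A minimal sketch, assuming a dump made on 5 January 2024 of a database called my_database (both are placeholders):

```
cd ~/db_backup/backups
unzip dbBackup_24-01-05.zip    # extracts the individual .sql dumps
# Import a dump; the target database must already exist
mysql -uroot -p my_database < 24-01-05_my_database.sql
```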
Here is S3/backup.php:
```
<?php
require dirname(__FILE__).'/s3.php';

$S3_KEY     = $argv[1];
$S3_SEC_KEY = $argv[2];
$bucket     = $argv[3];
$file       = $argv[4]; // file name, including path

$s3  = new S3($S3_KEY, $S3_SEC_KEY);
$key = basename($file);

// Upload the zip to the bucket as a private object
$s3->putObjectFile($file, $bucket, $key, S3::ACL_PRIVATE);
```
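If you want to sanity-check the S3 leg on its own, you can call backup.php directly with the same four arguments the shell script passes it; the key, secret key, bucket and zip path below are all placeholders:

```
php5 -f ~/db_backup/S3/backup.php "YOUR_KEY" "YOUR_SECRET_KEY" "db_backups" ~/db_backup/backups/dbBackup_24-01-05.zip
```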
Since we'll be executing backup.bin directly, it needs to be set as executable. While you're at it, give the script a test run. If you wish to use S3, you'll also need to add the S3 directory from the tutorial download below.
```
cd ~/db_backup
chmod +x backup.bin
./backup.bin
```
As with most Linux tools, no news is good news – if you didn't get an error message when running the script, chances are it completed successfully. Open your backups directory (cd backups) and admire your shiny new database backup zip file (ls).
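To confirm the archive actually contains one dump per database without extracting anything, you can list its contents in place (the file name will reflect today's date):

```
unzip -l dbBackup_24-01-05.zip
```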
Now for the actual 'automatic' part of our backups; for this we'll be using crontab. In your terminal window, type crontab -e. Depending on your setup, this should open a text editor such as nano. With crontab you can schedule backups to run as often as you like, at any time of day you like. I've set mine to run on Fridays at 1AM, since I'm unlikely to be actively using my PC then.
```
# m h dom mon dow command
0 1 * * 5 /home/your_user_name/db_backup/backup.bin
```
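The five time fields are minute, hour, day of month, month, and day of week. A couple of variations, including one that appends the script's output to a log file so failures are easier to spot later (the log path is just a suggestion):

```
# Every day at 2AM
0 2 * * * /home/your_user_name/db_backup/backup.bin
# Fridays at 1AM, appending output to a log
0 1 * * 5 /home/your_user_name/db_backup/backup.bin >> /home/your_user_name/db_backup/backup.log 2>&1
```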
Save and exit the text editor (for nano that's Ctrl+X, then y, then Enter). Easy!