
Use Cron to automate backups of MySQL databases and web folders to an FTP server or other locations (Ubuntu)

Some time ago I wrote a couple of mini how-to’s on automating backups of MySQL databases and web folders and uploading them to an FTP server – on a Windows server, using the 7-Zip executable (here and here). Today we’ll look at achieving the same thing – this time on Linux (LAMP).

Obviously, this short tutorial is not specific to Ubuntu; the Cron daemon and Bash scripts run on a wide range of Linux and UNIX distros. I will be using Ubuntu 14.04, hence the naming.

Also, there are many ways to get the same operations done in a Bash script; this is not the only way to perform automated backups – but it is one of the simplest.

Before moving on – in order to keep things simple – I am making the following assumptions:

  • We are setting this up (Cron and the backup script) on the same server that hosts the MySQL databases and web folders (the script for remotely hosted databases is slightly different)
  • We'll be working inside the user's Home folder, so becoming root will not be required
  • A word on security: the script will contain FTP credentials and other info. Anyone with access to your home folder will be able to see that information (see the snippet right after this list); and...
  • FTP connections are not regarded as secure – credentials and files travel in plain text. This is just a base guide.
  • Web server files are located at /var/www and my home folder is located at /home/gab
  • My FTP server has the following IP address: 192.168.0.5
  • We are backing up a single MySQL database - called database1
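Since anyone with read access to the script can lift the FTP and MySQL credentials out of it, it's worth locking the folders we're about to create down to your own user. A minimal precaution, assuming the paths used in this guide:

chmod 700 /home/gab/scripts
chmod 700 /home/gab/temp_backups

(We'll still make the script itself executable later with chmod +x.)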


Ok, let’s roll up our sleeves and get started. We’ll first create two new folders inside the home folder: scripts (to hold the script which Cron will execute) and temp_backups (to hold temporary backups). Bring up the terminal with Ctrl + Alt + T and switch into your Home folder:

cd /home/gab
mkdir scripts temp_backups

We’ll not be using the existing /tmp folder, as our folder needs to be cleaned out after each Cron run. People have all sorts of things in their /tmp folder; better leave that untouched.
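As an aside, if you'd rather not keep a permanent staging folder at all, mktemp can create a private scratch directory per run – a sketch of the idea, not used in the rest of this guide:

SCRATCH=$(mktemp -d /home/gab/backup.XXXXXX)   # private per-run folder
cd "$SCRATCH"
# ... generate and upload backups here ...
rm -rf "$SCRATCH"                              # remove the whole folder when done

We'll stick with the fixed temp_backups folder for simplicity. Let's switch into the scripts folder: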

cd /home/gab/scripts

Use Nano to create our script:

nano backups.sh

Our script will consist of 5 sections (the whole script is at the bottom of the article):
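One note before the sections: the very first line of the file should be the interpreter line (shebang), exactly as in the complete script at the bottom:

#!/bin/bash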

1. Provide login credentials for the FTP server (simple and self-explanatory)

#FTP server details
SERVER="192.168.0.5"
USERNAME="ftp_username"
PASSWORD="ftp_password"

2. Specify a location (folder) on the FTP server to which the backed-up (archived) files will be uploaded; I keep the backups in a folder called weekly_backups

# Set upload location on FTP server
BACKUPDIRECTORY="weekly_backups"
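If the weekly_backups folder doesn't exist on the FTP server yet, you can create it once by hand from an interactive session – log in when prompted, then (assuming your FTP account is allowed to create directories):

ftp 192.168.0.5
mkdir weekly_backups
quit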

3. Grab a snapshot of the web server folder along with a dump of a single MySQL database (database1), and archive them; we’ll end up with two files, which we then bundle into one single archive

# Generate backup files (whole www folder + mysql dump from single database)
# (cd first, and stop if the folder is missing, so the cleanup step can never
# run against the wrong directory)
cd /home/gab/temp_backups || exit 1
tar -zcf www_$(date +%Y%m%d_%H%M).tar.gz /var/www
mysqldump -u root -ppassword database1 > database1_$(date +%Y%m%d_%H%M).sql
# Bundle the two files into a single archive for upload
tar -zcf mysqldump_www_$(date +%Y%m%d_%H%M).tar.gz www_*.tar.gz *.sql

If we wanted to back up 3 MySQL databases, the mysqldump line would look like so:

mysqldump -u root -ppassword --databases database1 database2 database3 > database123_$(date +%Y%m%d_%H%M).sql

And if we wanted to back up all MySQL databases, our mysqldump line would look like so:

mysqldump -u root -ppassword --all-databases > all_databases_$(date +%Y%m%d_%H%M).sql

Note that I am using the MySQL root account and its password. Also, there is no space between “-p” and the password: if your password is “TooSexyForMyServer”, the option would look like “-pTooSexyForMyServer”.
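A side note on that password: passing it on the command line means it can show up in the process list while the dump runs. If that bothers you, mysqldump can read credentials from an option file instead – a minimal sketch, assuming the same home folder as above:

nano /home/gab/.my.cnf

[client]
user=root
password=TooSexyForMyServer

chmod 600 /home/gab/.my.cnf

With that file in place, the mysqldump lines above can drop the -u and -p options altogether.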

4. Uploading to FTP server

# Login to remote FTP server and perform the upload
ftp -n -i $SERVER <<EOF
user $USERNAME $PASSWORD
cd $BACKUPDIRECTORY
mput mysqldump_www*.tar.gz
quit
EOF
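As an alternative to the ftp client, curl can perform the same upload in a single line; a sketch, not used in the final script (the filename is just an example of what step 3 produces):

curl -T mysqldump_www_20140914_0100.tar.gz ftp://192.168.0.5/weekly_backups/ --user ftp_username:ftp_password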

5. Cleaning up

# Empty folder for next job
rm *.tar.gz && rm *.sql

Since the third step left us with three files inside the temp_backups directory (www_datestamp.tar.gz, database1_datestamp.sql and mysqldump_www_datestamp.tar.gz), we’ll need to clean up the folder for the next backup job – with our upload pattern being a simple wildcard, if we don’t clean up we’ll end up uploading the same files over and over again. Step five above takes care of that.
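If you'd rather not rely on the script's working directory when deleting, the same cleanup can be written with explicit paths:

rm -f /home/gab/temp_backups/*.tar.gz /home/gab/temp_backups/*.sql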

Let’s save the file and exit: Ctrl + O, Enter to save, then Ctrl + X to exit.

Before proceeding with Cron we need to make our backups.sh file executable:

chmod +x /home/gab/scripts/backups.sh
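At this point it's worth running the script once by hand and checking that the combined archive shows up in weekly_backups on the FTP server, before handing it over to Cron:

/home/gab/scripts/backups.sh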

Time for Cron

Ok, so far so good. The standard pattern for a crontab entry is:

minute (0-59)   hour (0-23)   day of month (1-31)   month (1-12)   day of week (0-7, where both 0 and 7 mean Sunday)   command

I would like my backup script to run at 1:00 AM every Saturday – so we’ll need to set this up accordingly. Our job line will look like this:

0 1 * * 6 /home/gab/scripts/backups.sh

Let’s bring up the Cron editing interface:

crontab -e


If this is the first time you are running this, you will be given 3 editor choices: ed, nano and vim.tiny.

I’ll choose Nano (yes, I know Vim is powerful, but let’s keep it simple).

Once you’ve made your choice, you’ll see the crontab editor. Move your cursor all the way to the bottom, and on the last line add the schedule and the backup script as per above (minute 0, hour 1, any day of the month, any month, weekday 6 = Saturday):

0 1 * * 6 /home/gab/scripts/backups.sh

Hit Ctrl + O, Enter to save, then Ctrl + X to exit. If you want to see the job lined up you can check with:

crontab -l

Also, if you would like to test whether the backups are completing, you can modify the line so the script runs every minute (don’t forget to put it back on the normal schedule afterwards). The last line added to the crontab would then look like this:

*/1 * * * * /home/gab/scripts/backups.sh

After a minute, check the upload location on your FTP server to verify that the backup completed, then put the job back on its intended schedule.
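While testing it also helps to capture the script's output, since Cron runs silently in the background; redirecting to a log file in the crontab line is a common trick (the log path is just an example):

*/1 * * * * /home/gab/scripts/backups.sh >> /home/gab/backup.log 2>&1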

The whole script:
#!/bin/bash

# FTP server details
SERVER="192.168.0.5"
USERNAME="ftp_username"
PASSWORD="ftp_password"

# Set upload location on FTP server
BACKUPDIRECTORY="weekly_backups"

# Switch to the temp folder; stop if it is missing so the cleanup
# step can never run against the wrong directory
cd /home/gab/temp_backups || exit 1

# Generate backup files (whole www folder + mysql dump from single database)
tar -zcf www_$(date +%Y%m%d_%H%M).tar.gz /var/www
mysqldump -u root -pmypassword database1 > database1_$(date +%Y%m%d_%H%M).sql

# Bundle the two files into a single archive for upload
tar -zcf mysqldump_www_$(date +%Y%m%d_%H%M).tar.gz www_*.tar.gz *.sql

# Login to remote FTP server and perform upload
ftp -n -i $SERVER <<EOF
user $USERNAME $PASSWORD
cd $BACKUPDIRECTORY
mput mysqldump_www*.tar.gz
quit
EOF

# Empty folder for next job
rm *.tar.gz && rm *.sql

I hope this was helpful; bear in mind this is just a simple script for performing backups. For example, instead of an FTP server you can point to a cloud-mirrored location (like Dropbox – surprise, surprise), you can use wildcard patterns to filter out files and folders, and so on.

There are also more complex operations that Cron and Bash scripting can handle – if you would like to dig deeper, have a look at the references below as a starting point.


References: