Setting up backup for a headless LAMP stack using Dropbox

I currently run a LAMP stack, and I needed a simple automatic backup process for my websites. This article shows how you can easily implement backups for these services using Dropbox and crontab, with a retention period to avoid running out of storage space.

Why Dropbox? Why not? It is a simple, free cloud service that takes care of much of the backup process for you. Keep it simple, stupid! 🙂

Getting Dropbox to run on Ubuntu headless

The first goal is to get Dropbox running. If you are running Linux headless like I am, you need some way to install Dropbox, hook it up to your account and start synchronizing files.

First you need to download Dropbox onto your server. Dropbox provides a downloadable deb package on their website. Find the link to the deb package, copy the URL, and on your server issue the wget command to fetch the file, like this:

wget -O dropbox.deb <URL>

Just replace <URL> with whatever link is current on the Dropbox website.

Step 2 is to install the deb file. Issue the following command to initiate the installation:

 dpkg -i dropbox.deb

After installation the ‘dropbox’ command should be available at your shell prompt.

Just installing the package won’t complete the entire Dropbox setup for you. To continue with the installation, run the Dropbox daemon installer:

dropbox start -i 

This will initiate the rest of the installation. Next is the account setup: you need to hook the machine up to a valid Dropbox account. During the installation you will be prompted with a unique URL. Visit this URL to finish the installation process.

When visiting the URL, log in with the Dropbox account you want to connect your LAMP stack to. When this is complete, Dropbox will confirm that you have successfully connected the computer to your account.

Issue the ‘dropbox start’ command to start the Dropbox background daemon.

Setting up backup of your web root

The main part of both the web backup and the MySQL backup is to get the necessary files into the Dropbox folder. This will automatically trigger the Dropbox daemon, which starts uploading the files to the cloud.

To start, create a shell script (for example webroot-backup.sh) with the following content:

#!/bin/sh
tar -zcf /path/to/your/dropbox/backup/webroot-`date +%s`.tar.gz /var/www/

Update the script with the correct path to your Dropbox folder. If your web root is located somewhere other than /var/www/, be sure to replace that as well.

This script will tar and gzip all the files in the /var/www/ directory, with a file name that changes on every run thanks to the epoch timestamp. The Dropbox daemon will then take care of uploading the archive to the cloud.

Test the script by running it: ./webroot-backup.sh (substitute your own script name).

PS: Remember that you may need to change permissions on the file to make it executable. Run chmod 700 on the script to make it executable by the file owner.
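If you want to see how the timestamped naming plays out before pointing the script at your real web root, here is a minimal, self-contained sketch using temporary directories (all paths and names here are illustrative stand-ins, not the real ones):

```shell
#!/bin/sh
# Sketch of the same tar-and-timestamp scheme in throwaway directories,
# safe to run anywhere. SRC stands in for /var/www/, DEST for the
# Dropbox backup folder.
SRC=$(mktemp -d)
DEST=$(mktemp -d)
echo "hello" > "$SRC/index.html"

ARCHIVE="$DEST/webroot-$(date +%s).tar.gz"
tar -zcf "$ARCHIVE" -C "$SRC" .   # -C keeps absolute paths out of the archive

tar -tzf "$ARCHIVE"               # list the contents to verify the backup
```

Listing the archive with tar -tzf is also a handy sanity check against your real backups before you trust them.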

MySQL backup

First you should create a separate user to take care of the backup process. This user should have a unique, long password along with the least amount of privileges needed for the backup. With mysqldump you usually only need the SELECT privilege to dump a database.

Connect to your MySQL database:

mysql -u root -p

Then issue the GRANT command to create a new user with the needed privileges:

GRANT SELECT ON database_name.* TO 'backup'@'localhost' IDENTIFIED BY 'your unique and strong password';
FLUSH PRIVILEGES;

Replace database_name with the name of the database you want to back up. The command ‘show databases’ will show you the different databases on your system. If you want to back up everything, use *.* instead of database_name.* in the GRANT statement. You should now have a database user with a unique password and the least amount of privileges.

The next step is to create a script that does the backup for us. For this task I’ve used mysqldump. Put the following code into a .sh file:

#!/bin/sh
FILENAME=/path/to/Dropbox/backup/mysql-`date +%s`.sql
mysqldump --user=backup --password=youruniquepassword database_name > $FILENAME
tar -zcf $FILENAME.tar.gz $FILENAME

This will issue mysqldump to do a dump of the database specified in the last argument. If you want to back up all databases, the backup user needs sufficient privileges and you must use the --all-databases argument to mysqldump. The script will additionally tar and gzip the dump file.
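One thing worth noting: the script above leaves both the raw .sql file and the .tar.gz in the Dropbox folder, so both get uploaded. A small, self-contained sketch of compressing and then deleting the uncompressed dump (using a temp directory and a fake dump so it is safe to run anywhere; in the real script FILENAME comes from mysqldump):

```shell
#!/bin/sh
# Sketch: compress a dump and delete the uncompressed copy so only the
# .tar.gz gets synced to Dropbox.
DIR=$(mktemp -d)
FILENAME="$DIR/mysql-$(date +%s).sql"
echo "-- fake dump --" > "$FILENAME"   # stand-in for mysqldump output

tar -zcf "$FILENAME.tar.gz" -C "$DIR" "$(basename "$FILENAME")"
rm "$FILENAME"                          # keep only the compressed archive

ls "$DIR"
```

Adding an rm line like this to the backup script roughly halves the storage each dump consumes.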

Remove old backup files according to your retention period

When using this backup process you must take care not to fill up your Dropbox storage. Of course, Dropbox would happily sell you more storage space, but if you are like me, you need to delete old backups instead. I use a backup retention period of 90 days, as I think that is enough for my specific usage.

To find and remove any files older than 90 days I use the following command:

find /path/to/backup/ -type f -mtime +90 -exec rm {} \;

Caution: this will remove any file in the folder whose modification time is more than 90 days old.
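Before wiring up -exec rm, it is worth doing a dry run with -print to see exactly what would be deleted. A self-contained sketch using a temp directory and an artificially aged file (the "100 days ago" date syntax needs GNU touch, which is standard on Ubuntu):

```shell
#!/bin/sh
# Dry run: print what find would delete, without deleting anything.
DIR=$(mktemp -d)
touch "$DIR/fresh.tar.gz"
touch -d "100 days ago" "$DIR/old.tar.gz"   # fake an old backup

find "$DIR" -type f -mtime +90 -print       # only old.tar.gz should show up
```

Once the -print output matches what you expect, swap -print for -exec rm {} \; against your real backup folder.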

Add the above command to a script file (for example backup-cleanup.sh) so that you can automate it in the next section.

Automating the scripts with Crontab

Of course you need to automate these scripts. I’d rather spend an extra hour automating something than not automate at all. In the case of backups, not automating the process is simply not an option.


Crontab is a nice way to schedule automated jobs. Open your crontab for editing with the -e flag:

crontab -e

The crontab uses a space-separated format where fields 1-5 indicate how often the task should be executed. The fields translate into minute (0-59), hour (0-23, 0 = midnight), day of month (1-31), month (1-12) and weekday (0-6, 0 = Sunday). The final field is the command to be executed.

To run all three scripts every Monday at 01:00, add the following lines to your crontab (substituting your own script names):

0 1 * * 1 /path/to/script/webroot-backup.sh
0 1 * * 1 /path/to/script/mysql-backup.sh
0 1 * * 1 /path/to/script/backup-cleanup.sh

Adjust the numbers accordingly if you want to change the frequency of the backup scripts.
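By default, cron mails any output from these jobs to the crontab owner. If you would rather keep a log file, you can redirect output in the crontab entry itself. A sketch, with illustrative paths and script names:

```shell
# m h dom mon dow  command
# Same Monday 01:00 schedule, appending both output and errors to a log:
0 1 * * 1 /path/to/script/webroot-backup.sh >> /var/log/backup.log 2>&1
```

Checking that log occasionally is a cheap way to catch backups that have silently stopped working.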


If you have anything to contribute to the scripts, or if you have a different way of implementing smart backup routines, let me know in the comment section below! I always appreciate feedback, whether it be constructive criticism or praise.

Chris Dale

I'm Chris Dale from Norway, founder and principal consultant at River Security. Along with my security expertise, I have a background in system development and application management. Having vast and broad experience in IT certainly helps a great deal when working on penetration tests and incidents.

I am an open, sharing and engaging person to be around, some even think I'm funny. I am usually enthusiastic and motivating when I work, and usually positive and optimistic about the general problems I encounter. I am passionate about security, both IT and physical security, which is one of the reasons I do a lot of public speaking at different events such as classes, conferences and workshops.

Driven by mottos such as "Magic is just science we don't understand yet" and "Think bad, do good", I attack today's security challenges with eagerness and enthusiasm. I consider myself a pragmatic person, with the ability to think outside the box, keeping the business in focus.

I also teach for SANS. The class I primarily teach is Hacking Techniques, Exploits & Incident Handling, which prepares you for the GIAC Certification in Incident Handling (GCIH). I find it extremely motivating and fun to teach others the art of security and hacking, and I often find that my passion and enthusiasm rubs off on my students.
