
Revised
March 07, 2015

HOW TO
Incremental Backup Script

There are two scripts.  Both are quite flexible, so tailor them for your own use.

The first script makes backups as tar balls.  You can copy the script to make backups for various time periods, such as one for daily and one for weekly backups.  These are incremental backups, meaning they copy only the files that have been modified within a period of time.  You can set which directories are backed up, each getting its own named tar ball.

The second script is used to remove outdated backups.   Hence you can do a daily backup, but set the cleanup script to remove backups that are older than x number of days.

Please send any corrections, suggestions, or general feedback about this page.

Note:  Current version of backup is 1.0.8, August 20, 2008 and for cleanup 0.9, August 14, 2008.

My Process:

This is just to give you an idea.

Short Term:

This covers directories that I change a lot during the day, such as the directories that contain my web site.  If I screw up, I can go back and find the copy that was last saved during that period.  These backups are purged from the system after a few days by a cleanup script.

I run this backup every other hour and purge it after 4 days.  I also run a daily backup every night, which is purged after 21 days.

Archived:

I do a weekly and a monthly backup.   These are then burned to a DVD.   Currently I am burning these to a DVD once a month.

Howto Set It Up:

These steps will have to be repeated for each backup (hourly, weekly, etc.).

Where to Store:

Storage should be in a partition that is on a separate hard drive.   For mine, I store the backups in /mnt/backup/.

Create a directory such as /mnt/backup/Daily or /mnt/backup/Monthly to put the backups in.
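From the command line, that looks like this (the /mnt/backup paths are the example locations used on this page; adjust them to your own mount point):

```shell
# Create the destination directories for the backups (run as root)
mkdir -p /mnt/backup/Daily /mnt/backup/Monthly

# Confirm they exist
ls /mnt/backup
```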

Download:

Click here to view the script in text.  Right-click here to save the shell script to /usr/local/bin, which is the proper location.

Rename /usr/local/bin/Backup_Current.sh to something like Backup_Daily.sh.   Create a different one for each backup.

Then make the script executable.  In your file manager, right-click to Properties and check "is executable".  Or like this:

$ cd /usr/local/bin

$ chmod +x Backup_*.sh    (Where * is your new name)

Configure:

Open /usr/local/bin/Backup_*.sh in a text editor.   Then find the "CONFIGURE SECTION".

STORAGE:  Set the path to the directory that you created above.

PATHLIST:  This lists all the directories that will be backed up.  Each directory will have its own tar ball, named for the directory.  Example: /root will have the file name root_2008-08-20-09.tar.gz

List the directories to be backed up with a space between them, for example "/etc /home /root /srv".

If you do /home/*, a separate tar ball will be created for each sub-directory in /home.

There are quite a few more comments in the script.

PRE:  This is the prefix to the tar ball file name.  Since I burn both Weekly and Monthly to DVD, I put them both in the same directory, so the prefixes keep them distinct.  You can also just leave this blank.

TIMETYPE:  Set to either "day" or "minute".  This sets how long the period will be in the next variable, PERIOD.

PERIOD:  The number of minutes or days to use.  Hence if set to "-2", each file that has been modified in the last two days will be backed up.

Note that days are counted back from midnight.  So if you do a daily backup 10 minutes after midnight and set PERIOD="-1", you get only the last 10 minutes; set PERIOD="-2" instead.  For a mid-day backup use minutes, i.e. MINDAY="-mmin" and PERIOD="-1442" (24 hours times 60 minutes, plus 2).  For a weekly backup in minutes, use PERIOD="-10082".

FILEOWNER:  The user who will own the backup.  Could be yourself or root.

FILEGROUP:  The group that will own the backup, or just root.

PERM:  Set permissions.  If a group is not used, 400 is suggested.
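Putting those variables together, the core of such an incremental backup is a find piped into tar.  The following is only a sketch under the assumptions above (STORAGE, PATHLIST, PRE, and a TIMETYPE/PERIOD pair mapped to find's -mtime/-mmin); the actual Backup_*.sh will differ in detail:

```shell
#!/bin/sh
# Sketch of the incremental find+tar step (assumed structure,
# not the actual Backup_*.sh script).

incremental_backup() {
    STORAGE="$1"    # e.g. /mnt/backup/Daily
    PATHLIST="$2"   # e.g. "/etc /home /root /srv"
    PRE="$3"        # tar ball prefix; may be empty
    FINDOPT="$4"    # -mtime (days) or -mmin (minutes)
    PERIOD="$5"     # e.g. -2 = modified within the last 2 days

    STAMP=$(date +%Y-%m-%d-%H)
    for DIR in $PATHLIST; do
        NAME=$(basename "$DIR")
        TARBALL="$STORAGE/${PRE}${NAME}_${STAMP}.tar.gz"
        # Archive only the recently modified files,
        # NUL-separated so odd file names survive.
        find "$DIR" -type f "$FINDOPT" "$PERIOD" -print0 \
            | tar --null -czf "$TARBALL" --files-from=-
    done
}

# Example (run as root):
# incremental_backup /mnt/backup/Daily "/etc /srv" Daily_ -mtime -2
```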

Test:

# cd /usr/local/bin

# ls

# sudo Backup_Daily.sh     (or whatever you named it.)

With your file manager, check to see that the backup was made in your destination directory.

Schedule:

Setup /usr/local/bin/Backup_*.sh in cron to run at your chosen time.   Run as root. (Personally I like kcron.)
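For example, a root crontab entry (assuming the script was renamed Backup_Daily.sh, and following the PERIOD note above about running a few minutes after midnight) might look like:

```
# Edit root's crontab with:  sudo crontab -e
# min hour day month weekday  command
10    0    *   *     *        /usr/local/bin/Backup_Daily.sh
```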

If you want to set up the script to remove the old backups, proceed to the next section.

Cleanup Script:

This script will be run to delete old backups.   If you are saving to CD or DVD, then you do not need this script.   If you have more than one backup directory, such as an hourly and a daily, you only need one cleanup script.

Download:

Click here to view the cleanup script in text.   Click here to save the cleanup shell script.   Save to /usr/local/bin, which is the proper location.

Configure:

Configure the two variables, STORAGE and DURATION as needed.
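As a rough sketch of what such a cleanup amounts to (the variable names STORAGE and DURATION are taken from above; the actual cleanup script may differ):

```shell
#!/bin/sh
# Sketch of the cleanup step: delete tar balls older than DURATION days.

cleanup_backups() {
    STORAGE="$1"    # e.g. /mnt/backup/Daily
    DURATION="$2"   # delete backups older than this many days
    # -mtime +N matches files last modified more than N days ago
    find "$STORAGE" -name '*.tar.gz' -mtime "+$DURATION" -delete
}

# Example: remove daily backups older than 4 days
# cleanup_backups /mnt/backup/Daily 4
```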

Comments:

I did back up a Microsoft® machine, but have turned off that backup due to security concerns, and now do not have any Microsoft machines in the house.  Since you need access to the machine via a share, this is not secure.

Hourly Scripts:

I run this backup every 1 or 2 hours, but only during the hours it is likely to be in use, such as from 7 AM to 9 PM at home and every hour from 6 AM to 6 PM on our Church server.  It runs just a few minutes after the hour.

I commonly back up the /etc, /home/* and /srv (which contains Samba file server and Apache web pages) directories.

I keep a copy for 4 days.

Daily Scripts:

Run every night, just after midnight.  Basically the same as the hourly, but I added some other directories such as /root.  I have also added /var/* just to look at log activity, not to keep.

I keep these for three weeks.

Archived Backups:

These are weekly and monthly backups.   I burn these to a DVD once a month.


Warnings and Notes:

Script file names end in ".sh".

Script files must have the Execute Permission enabled for it to run.

To learn about writing bash (Bourne-again shell) scripts and their commands, a lot of information can be found with a Google search.  Another source for beginners is "The Linux Terminal - a Beginners' Bash".

When setting "-mtime" (for example "-mtime -7"), a value of -1 includes only the date the script was run.  Hence if you use -1 and run the script just after midnight, only a few minutes of changed files will be captured.

Set the backup period to twice the backup frequency.  For example, if you are doing a weekly backup, back up the last 14 days.  Then if something happens to one of the backups, not all of the data for that week is lost.

To run the script file manually, just click on the file name in the Konqueror File Manager once.

Make sure you use the "if" check.  If the storage directory cannot be accessed, the backup files will be stored in the /root directory.  Hence if you are backing up the /root directory, you are backing up backups; the size of the /root backup will grow and you will run out of hard drive space on your / partition.  Not good.
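A sketch of such a guard, placed before any tar command runs (the function name and message here are illustrative; the real script's check may look different):

```shell
#!/bin/sh
# Guard sketch: refuse to run when the backup partition is not mounted,
# so tar balls are never silently dumped into /root.

check_storage() {
    if [ ! -d "$1" ]; then
        echo "Backup directory $1 not found; aborting." >&2
        return 1
    fi
}

# In the backup script, before creating any tar balls:
# check_storage /mnt/backup/Daily || exit 1
```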

Do not store the backup files on the same hard drive that you are backing up.   So if a hard drive dies, then all is not lost.

Microsoft uses spaces in some of their directory names.   A very bad practice.   Your script will not run right if that space is included.   I have not found a good answer on how to get around this problem.
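For what it is worth, one approach that works with GNU find and tar (this is not from the original script) is to pass the file names NUL-separated, so embedded spaces are never split:

```shell
# Sketch: NUL-separated names from find survive spaces in path names.
mkdir -p "/tmp/My Documents"
echo hello > "/tmp/My Documents/a file.txt"

# -print0 and --null pass names NUL-terminated instead of splitting
# them on whitespace
find "/tmp/My Documents" -type f -print0 \
    | tar --null -czf /tmp/docs_backup.tar.gz --files-from=-

# The file with a space in its name is in the archive intact
tar -tzf /tmp/docs_backup.tar.gz
```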

Microsoft is funky about upper and lower case letters.   I have not been able to figure out how it works.   So suggest testing to see if it works.

Make sure the backup file that is created is not so large that it will not fit on one CD-ROM.  Also, a large backup file can take a long time to open; a couple of hundred megabytes can take a very long time.

A nice thing about tar ball (.tar.gz) files is that they can be opened just like a directory in the Konqueror file manager.  You can then find the file you need and drag and drop it to copy or restore it.
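The same can be done from the command line.  The example below builds a throw-away tar ball first so the commands are self-contained; the paths are illustrative only:

```shell
# Build a small sample tar ball to demonstrate with (example paths)
mkdir -p /tmp/demo/root
echo "demo" > /tmp/demo/root/.bashrc
tar -czf /tmp/demo/root_backup.tar.gz -C /tmp/demo root

# List the contents of a backup without extracting it
tar -tzf /tmp/demo/root_backup.tar.gz

# Pull a single file back out, into /tmp
tar -xzf /tmp/demo/root_backup.tar.gz -C /tmp root/.bashrc
```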

/home on Linux / UNIX and C:\WINDOWS\Profiles on Microsoft can get very large.  One reason for this is that an e-mail client does not compress or clean up its e-mail folders.  Make sure that preference is set in the client.


If you have any comments, problems, questions or suggestions, please send me an e-mail at .
