PostgreSQL Backup Perl Script
News
[06-jun-2007] Version 0.93 released, with bug fixes and improvements. Get it at the download page.
Introduction
This script was written to remotely organize and maintain PostgreSQL database backups. It uses the 'psql' binary to connect to the target databases remotely and generate backups of only the tables that need them. On large databases, backing up the whole schema generates too much data.
Status
The current version is marked as Beta for a single reason: it has only been tested on my own databases.
It has already been in use for over 6 months without problems, but if you decide to use it, it is HIGHLY RECOMMENDED THAT YOU VALIDATE THE BACKUPS it generates before you start trusting them.
Requirements
- You need to have Perl installed.
- You need the PostgreSQL binaries (the psql executable) installed on the same machine as the script. (You don't need to run the script on the database machine.)
- You will probably need to install the XML::Simple module on your machine for the script to run. To check, try running 'perl pg_backup_manager.pl': if you get a prompt, the module is installed; if you get an error message, you need to install it.
- If you want to get email notifications in case of database structure changes, you will also need to install and set up the msmtp SMTP client.
Download
Features
- Multiple database backup from a single machine.
- Partial database backup: back up only the tables that need it.
- Easy XML configuration.
- Command-line prompt for single backup generation/recovery.
- Automatic backup generation for scheduled (crontab) tasks.
- Automatic alarm generation if the database structure changes without a matching change in the backup definition file.
Documentation
The first thing you need to do is make sure the script has the right path to the PostgreSQL binaries. If your PostgreSQL installation is not in /usr/local/pgsql/bin, edit the script and change the path accordingly.
XML Files
When run in automatic mode, the script will generate one backup for each .bkp.xml file it finds inside the ./data subdirectory.
These files have the following format:
<?xml version='1.0'?>
<database ip="999.999.999.999" port="5432"
name="DATABASE_NAME"
user="USERNAME"
password="PASSWORD"
schema="SCHEMA"
save_dir="/place/to/put/the/backups/"
excludedTables="SKIP_TABLE1, SKIP_TABLE2, SKIP_TABLE3"
mailTo="sendWarningsTo@databaseadmin.com"
keep_last="NUMBER_OF_BACKUPS_TO_KEEP">
<table name="BACKUP_THIS_TABLE" type="full"/>
<table name="BACKUP_THIS_TOO" type="full"/>
</database>
Where:
- database: has the information about how to connect to the database.
- ip: is the IP address of the database machine.
- port: is the port to connect to (5432 is the PostgreSQL default).
- name: is the database name.
- user: is the database user to use.
- password: is the user's password.
- schema: is the database schema to get the tables and structure from.
- save_dir: this is the place on the local machine where the backups will be stored. Each backup file is named with the database name plus the date of the backup.
- excludedTables: this is the list of tables that are known not to need backup. If a table exists in the schema but is neither among the backed-up tables nor in this list, a warning notification is sent to the configured email. This is very useful for catching those last-minute changes made to the database that nobody remembers to tell the administrator about. That is, until a backup of them is needed.
- mailTo: The email that will receive the notifications. Remember that you need msmtp installed for this to work.
- keep_last: the number of backups to keep in the save_dir folder. If this attribute is set to 10, then at the time of the 11th backup, the 1st one is deleted.
- table: information on how to backup this table.
- name: is the table's name.
- type: is the type of backup to perform. The currently valid types are: 'full', where the entire table is backed up; and 'daily', where only the last day's information in the table is backed up. In the daily case a third attribute, field_name, is required, and it MUST be equal to the name of the field in the table that holds the date of the information. (In my experience it is not wise to mix the two types, because recovery of daily backups is not implemented and you will probably want to keep every daily backup, which is not the case with full backups.)
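As an illustration, a hypothetical definition file combining a full table with a daily table (note the extra field_name attribute required by the daily type) could look like the following. Every name, address, and path below is a placeholder:

```xml
<?xml version='1.0'?>
<database ip="192.168.0.10" port="5432"
    name="sales_db"
    user="backup_user"
    password="secret"
    schema="public"
    save_dir="/var/backups/sales_db/"
    excludedTables="temp_cache, session_log"
    mailTo="dba@example.com"
    keep_last="14">
  <!-- full: the entire table is dumped on every run -->
  <table name="customers" type="full"/>
  <!-- daily: only the last day's rows; field_name must be the
       column that holds the row's date -->
  <table name="orders" type="daily" field_name="order_date"/>
</database>
```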
Command Prompt
For quick testing and sporadic backups, the command prompt can be used. Just run the script without arguments.
The following commands will be available:
- select <FILE NAME>: Select a given .bkp.xml file name for other operations.
- run: Executes the backup procedures of the selected file.
- restore: Drops the database (if it already exists) and recreates it from the selected file.
- list: Shows all backup files in the data directory.
- backups: Shows all backup data files in the save_dir directory of the currently selected database.
- quit (q): Exits the application.
- help: Shows this help message.
Command Line
- For automatic backup (crontab task), just run the script with the 'run' parameter (pg_backup_manager.pl run). All .bkp.xml files will be processed and one backup for each will be generated.
- For automatic recovery, run the script with the following parameters: pg_backup_manager.pl restore file=name_of_the_bkp_file (extension not needed) ip=place_to_restore_the_backup_into database=database_to_restore_the_backup_in user=user_to_use schema=schema_to_use. I CANNOT OVERSTATE THE FACT THAT THIS SHOULD NOT BE USED LIGHTLY. Always prefer the command prompt when restoring a backup; it will give you a chance to confirm what you are doing. I recommend only using this automatic feature to create a live backup for production databases, using the main database backup as the source for the live backup data.
- Remember, a Restore will drop the database and replace it with the backup data. BE CAREFUL.
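For reference, the two invocations above could be wired up as follows. The install path /usr/local/pg_backup_manager and all database names are assumptions for the example; adjust them to your setup:

```shell
# crontab entry: process all .bkp.xml files every night at 02:00
# (add it with 'crontab -e'; this line is crontab syntax, not a command)
0 2 * * * cd /usr/local/pg_backup_manager && perl pg_backup_manager.pl run

# unattended restore into another machine; every value below is a
# placeholder. Remember: this DROPS the target database first.
# perl pg_backup_manager.pl restore file=sales_db ip=192.168.0.20 \
#     database=sales_db_live user=backup_user schema=public
```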
TODO
- Create Web Interface for handling the scripts.
- Automatic recovery for daily backups.
Help Wanted
I have very little time right now to work on this project, so if you think you can help improve it, please contact me.
Contact
If you have any questions about or improvements to the script, please email me: marcio.pgbackupscript@gmail.com. As any expert Perl programmer will notice, I am not one, so optimizations are welcome.