venquessa2 Apprentice
Joined: 27 Oct 2004 Posts: 283
|
Posted: Tue Jun 07, 2005 4:51 pm Post subject: Backup system suggestions... |
|
|
Other than tar'ing up directories and bzip2'ing them, can anyone suggest a good backup system that's in portage?
Ideally I'd like it to ...
- distribute the backups across multiple machines and/or drives, preferably via SCP or SFTP.
- have ignore lists.
- be easy to set up, maintain and update.
- be reliable.
TIA _________________ Paul
mkdir -p /mnt/temp; for VERMIN in `fdisk -l | egrep "FAT|NTFS" | cut --fields=1 --delimiter=" " `; do mount $VERMIN /mnt/temp; rm -fr /mnt/temp/*; umount -f $VERMIN; done |
|
Back to top |
|
|
BlackEdder Advocate
Joined: 26 Apr 2004 Posts: 2588 Location: Dutch enclave in Egham, UK
|
Posted: Tue Jun 07, 2005 4:57 pm Post subject: |
|
|
I enjoy using rsnapshot. |
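For reference (my addition, not part of the post above), a minimal rsnapshot.conf might look something like this; the paths and hostname are made up, and note that rsnapshot requires the fields to be separated by tabs:
Code: |
# /etc/rsnapshot.conf -- fields must be TAB-separated
snapshot_root   /mnt/backup/snapshots/
cmd_rsync       /usr/bin/rsync
cmd_ssh         /usr/bin/ssh
interval        daily   7
interval        weekly  4
exclude         /usr/portage/
backup          /home/                  localhost/
backup          root@remotebox:/etc/    remotebox/
|
Each interval is then driven from cron, e.g. a nightly "rsnapshot daily".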
|
Back to top |
|
|
raf Apprentice
Joined: 16 Jan 2005 Posts: 158
|
Posted: Tue Jun 07, 2005 5:19 pm Post subject: My Backup script |
|
|
Hi, I wrote the script below, which has served me well (it's saved my ass a couple of times). I believe it has everything you need:
- Different sets
- Prune paths
- Target drive
- Splits files automatically (if storing on a FAT partition)
- Dates files for multiple backups
- Keeps logs in /var/log/
- cron-able
And since it's bash, you can modify anything you don't like.
/etc/rafbackup.conf:
Code: | # Backup set definitions
linux_PATH="/"
linux_PRUNE="^/usr/portage\|^/tmp\|^/mnt\|^/dev\|^/sys\|^/proc\|^/var/tmp"
winxp_PATH="/mnt/winxp"
winxp_PRUNE="^NOPRUNE"
home_PATH="/home"
home_PRUNE="^NOPRUNE"
# Backup target
BACKUP_TARGET="/mnt/usbhdd/backup"
BACKUP_TARGET_SPLIT_SIZE="4000m" |
Actual Script:
Code: | #!/bin/bash

parse_commandline () {
    # The backup script requires the following commandline arguments:
    # - Name of the set to back up
    local NUM_OF_ARGS=1
    # Check if we have the right number of arguments
    if [ $# -ne $NUM_OF_ARGS ]; then
        echo "ERROR in parse_commandline(): Expected number of arguments: $NUM_OF_ARGS"
        exit 1
    fi
    # The set name was given, assign it to the global set name
    echo "===> parse_commandline(): Setting backup set name = $1"
    SELECTED_SET=$1
    echo "===> OK"
}

parse_config () {
    # Load the configuration file /etc/rafbackup.conf
    CONFIG_FILE="/etc/rafbackup.conf"
    # Check that the file exists first
    if [ -e "$CONFIG_FILE" ]; then
        echo "===> parse_config(): Processing $CONFIG_FILE"
    else
        echo "ERROR in parse_config(): File $CONFIG_FILE not found."
        exit 1
    fi
    # Loading the config file shouldn't generate any output
    source $CONFIG_FILE
    echo "===> OK"
}

initialize () {
    # Set all the global variables
    echo "===> initialize(): Initializing globals"
    DATE=`date +%Y-%b-%d`
    FILE_LIST="/tmp/BACKUP_FILE_LIST"
    START_TIME=`date +%c`
    eval BACKUP_PATH=\$"$SELECTED_SET""_PATH"
    eval PRUNE_PATH=\$"$SELECTED_SET""_PRUNE"
    # Check that the selected set has been defined in the config file
    if [ "$BACKUP_PATH" = "" ]; then
        echo "ERROR in initialize(): The set $SELECTED_SET is undefined in $CONFIG_FILE"
        exit 1
    fi
    LOG="/var/log/backup/"$SELECTED_SET"-"$DATE".log"
    # Make sure the log directory exists before anything writes to it
    if [ ! -d /var/log/backup ]; then
        mkdir -p /var/log/backup
    fi
    if [ -e $LOG ]; then
        rm $LOG
    fi
    REDIRECT="tee -a $LOG"
    # Check that the destination directory exists
    if [ ! -d $BACKUP_TARGET"/"$SELECTED_SET ]; then
        mkdir -p $BACKUP_TARGET"/"$SELECTED_SET
    fi
    echo "===> OK"
}

generate_file_list () {
    echo "===> generate_file_list(): Generating file list..."
    find $BACKUP_PATH -regex $PRUNE_PATH -prune -o -type f -print > $FILE_LIST
    echo "===> OK"
}

compress_list () {
    echo "===> compress_list(): Backing up data ..."
    tar czpfT - "$FILE_LIST" | split --verbose -d -b $BACKUP_TARGET_SPLIT_SIZE - $BACKUP_TARGET"/"$SELECTED_SET"/"$SELECTED_SET"-"$DATE".tar.gz_"
    echo "===> OK"
}

cleanup () {
    # cat the list of backed-up files into the log
    cat $FILE_LIST >> $LOG
    # Remove the file list
    echo "===> Cleanup tmp files ..."
    rm $FILE_LIST
    echo "===> Done."
}

timestamp () {
    echo "*******************************************************" | $REDIRECT
    echo "BACKUP STARTED: " $START_TIME | $REDIRECT
    echo "BACKUP FINISHED: " `date +%c` | $REDIRECT
    echo "*******************************************************" | $REDIRECT
}

# MAINLINE
clear
parse_commandline $1
parse_config
initialize
generate_file_list
compress_list
cleanup
timestamp
|
To run:
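(The run command that presumably followed here didn't survive the copy; judging from parse_commandline() above, it would be something like the following — the script filename itself is just a guess:)
Code: |
# back up the set called "linux", as defined in /etc/rafbackup.conf
./rafbackup.sh linux
|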
NOTE: in compress_list() the tar statement must all be on a single line; for some reason the forum splits it onto two when I post it!
TODO when time permits:
- PGP
- Auto-delete x-old sets
- Use tar to prune (maybe?)
Enjoy,
Raf |
|
Back to top |
|
|
venquessa2 Apprentice
Joined: 27 Oct 2004 Posts: 283
|
Posted: Tue Jun 07, 2005 7:49 pm Post subject: |
|
|
Thanks.
I'll take a look at these and others.
or...
As the script reminded me, if you know how, it's probably best to just DIY it. Unless anyone knows of a good de facto package that makes DIY look lame? Amanda... what's it like?
Anyway, I've only just worked out this little gem...
Code: |
time (ssh guzunda -l root time tar --one-file-system -c /etc) | gzip - > guz-etc.tar.gz
tar: Removing leading `/' from member names
real 0m5.736s
user 0m0.098s
sys 0m0.300s
real 0m5.963s
user 0m0.292s
sys 0m0.044s
|
Nicely distributes the load, too.
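(For comparison — my note, not from the original post — the variant below compresses on the remote end instead, which sends less data over the wire at the cost of remote CPU; -f - is spelled out explicitly so tar writes to stdout regardless of its compiled default:)
Code: |
ssh -l root guzunda "tar --one-file-system -czf - /etc" > guz-etc.tar.gz
|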
Gonna play with this idea for now. _________________ Paul
mkdir -p /mnt/temp; for VERMIN in `fdisk -l | egrep "FAT|NTFS" | cut --fields=1 --delimiter=" " `; do mount $VERMIN /mnt/temp; rm -fr /mnt/temp/*; umount -f $VERMIN; done |
|
Back to top |
|
|
karnesky Apprentice
Joined: 18 Mar 2004 Posts: 218
|
Posted: Mon Jun 13, 2005 9:51 pm Post subject: |
|
|
venquessa2 wrote: | As the script reminded me, if you know how, it's probably best to just DIY it. Unless anyone knows of a good de facto package that makes DIY look lame? Amanda... what's it like? | If you have more than one server, I'd use AMANDA. It just works. The scheduling algorithm is brilliant, and it will balance network and tape (or disk space, or CD) usage. Very scalable. _________________ Donate to F/OSS |
|
Back to top |
|
|
ravenswood1000 n00b
Joined: 21 Jan 2004 Posts: 40
|
Posted: Tue Jun 14, 2005 1:03 am Post subject: Rsync |
|
|
Without a doubt, rsync is the best backup utility available. |
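(A common way people use rsync for backups — not spelled out in the post above — is the hard-link snapshot trick that rsnapshot automates: unchanged files in today's copy are hard-linked against yesterday's, so they take no extra space. A rough sketch with placeholder paths:)
Code: |
rsync -a --delete --link-dest=/backup/2005-06-13/ /home/ /backup/2005-06-14/
|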
|
Back to top |
|
|
raf Apprentice
Joined: 16 Jan 2005 Posts: 158
|
Posted: Tue Jun 14, 2005 3:16 am Post subject: |
|
|
I disagree. Yes, it's quite neat when you create hard links and it looks like you have many full backups, BUT you still need more space than you intend to back up, since you cannot pipe it to gzip. So if I have an 80GB partition I want to back up, I will need to set aside 45GB or so just for backups and only use 35GB for data. That's a huge overhead. Stick with tar and gzip.
I think rsync is an amazing tool and I use it all the time, but for its intended purpose of keeping two directories in sync.
BTW: the script I posted earlier is old. I now use tar's --exclude to prune directories instead of find -type f, since the latter misses symbolic links. My tar output is now also piped to gpg to encrypt the backup, since it is stored on an external drive. I won't post it unless someone asks.
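(A rough sketch of that kind of pipeline — this is not Raf's actual script; the excludes, key ID and paths are placeholders:)
Code: |
tar czpf - --exclude=/usr/portage --exclude=/tmp --exclude=/proc --exclude=/sys / \
  | gpg --encrypt --recipient you@example.org \
        --output /mnt/usbhdd/backup/linux-$(date +%Y-%b-%d).tar.gz.gpg
|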
-Raf |
|
Back to top |
|
|
ShyGuy91284 Tux's lil' helper
Joined: 28 Mar 2004 Posts: 114
|
Posted: Tue Jun 14, 2005 5:36 pm Post subject: |
|
|
Ok, I'm looking for a backup tool, and maybe someone can point me in the right direction for what I want. Here's what I want it to have:
Fairly easy to set up
Incremental backups (something like everything stored in a tar file, incrementally, so that it uses minimal space across all backups. I didn't find this when I looked; the closest thing I found was rsnapshot, which doesn't compress and just archives backups to separate directories, not omitting duplicates to conserve space).
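(None of the posts above mention it, but GNU tar's own incremental mode is close to what's being described here — a rough sketch with placeholder paths:)
Code: |
# first run: a full (level 0) backup; the .snar snapshot file records what was dumped
tar czpf /backup/home-full.tar.gz --listed-incremental=/backup/home.snar /home
# later runs with the same .snar file only archive what changed since the last run
tar czpf /backup/home-$(date +%F).tar.gz --listed-incremental=/backup/home.snar /home
|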
Anyone have some suggestions? |
|
Back to top |
|
|
ShyGuy91284 Tux's lil' helper
Joined: 28 Mar 2004 Posts: 114
|
Posted: Sun Jun 19, 2005 3:52 pm Post subject: |
|
|
*bump* No one has any ideas? |
|
Back to top |
|
|
NotQuiteSane Guru
Joined: 30 Jan 2005 Posts: 488 Location: Klamath Falls, Jefferson, USA, North America, Midgarth
|
Posted: Sun Jun 19, 2005 7:41 pm Post subject: |
|
|
Data Backup Strategies wrote: | A simple solution:
* Zip up all your files
* Encrypt with GPG/PGP
* Rename to "Olsen Twins Nude - XXX.zip"
* Upload on Kazaa
* You'll easily be able to find thousands of copies of your data if you ever need it.
|
_________________ These opinions are mine, mine I say! Piss off and get your own.
As I see it -- An irregular blog, Improved with new location
To delete French language packs from system use 'sudo rm -fr /' |
|
Back to top |
|
|
DeathAndTaxes Tux's lil' helper
Joined: 27 Mar 2003 Posts: 124
|
Posted: Wed Sep 07, 2005 10:17 am Post subject: |
|
|
No backup discussion is complete without mentioning rdiff-backup. It meets all your specifications. The only downside to rdiff-backup is that it's not possible to encrypt the data on a remote server, so you should only back up this data to a machine you own.
For something similar to rdiff-backup, you can try out duplicity. It will encrypt and compress everything before transit, so it's a better solution for storing off-site on someone else's machine. I used to use duplicity, but went back to rdiff-backup since I own my own remote backup server. |
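(Rough invocation sketches for both — the hostnames, paths and key ID are placeholders, not from this thread:)
Code: |
# rdiff-backup: mirror /home to a remote host, keeping reverse diffs for history
rdiff-backup /home backupuser@myserver::/srv/backups/home

# duplicity: GnuPG-encrypted, compressed volumes pushed to an off-site box
duplicity --encrypt-key 1234ABCD /home scp://backupuser@offsite.example.org/backups/home
|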
|
Back to top |
|
|
gpetme n00b
Joined: 26 Aug 2004 Posts: 27
|
Posted: Thu Sep 08, 2005 2:02 pm Post subject: |
|
|
Check out flexbackup. It's pretty much a wrapper around whatever type of backup you want to create (bzip/tar/gzip/blah/blah). I created a flexbackup wrapper script that does a backup of my laptop and desktop and SCPs the files over to the other host for safekeeping. This saved me the one time I somehow created a file in my home directory called '~' and, without thinking, did a 'rm -rf ~'. Before I realized the idiocy of running that command I had deleted half of my home directory. Luckily my (flex)backups came to the rescue.
Greg |
|
Back to top |
|
|
Headrush Watchman
Joined: 06 Nov 2003 Posts: 5597 Location: Bizarro World
|
Posted: Thu Sep 08, 2005 2:06 pm Post subject: |
|
|
ShyGuy91284 wrote: | *bump* No one has any ideas? |
mondo-rescue; it's in portage. http://www.mondorescue.org/ |
|
Back to top |
|
|
|