bishop00s Tux's lil' helper
Joined: 27 Apr 2004 Posts: 139 Location: pittsburgh, pa
Posted: Thu Sep 09, 2004 5:18 pm Post subject: archiving files
I want to periodically (i.e. daily) archive a directory to a smaller disc. Can I use vixie-cron to schedule a daily job that archives and compresses (gzip?) all files and subdirectories in a given directory?
Crisis l33t
Joined: 10 Feb 2003 Posts: 613 Location: Portland, OR
Posted: Thu Sep 09, 2004 6:46 pm
Sure can.
What part do you need help with, the scripting or the scheduling?
Basically you'll want to make a small shell script to do the backup, and then add a crontab entry to schedule it.
Type crontab -e to edit your crontab.
Then to add an entry that runs each night at midnight, for example, you might do something like this:
0 0 * * * /home/crisis/backup.sh > /dev/null
In your shell script you perform your file operations, which are really up to you. You may want to rename old backup files first, or you may simply want to remove the old one and replace it with the new file.
Check out the man page for tar, and I bet you'll find a solution that fits your needs using tar, possibly piped through bzip2 (or gzip etc.) to compress.
I do something similar: my file server backs up data on all the machines on my network via crontabs and shell scripts.
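To make the tar half concrete, here's a minimal sketch of what such a backup.sh could look like. The /tmp paths and the demo file are placeholders so it runs standalone; point SRC and DEST at your real data directory and backup disc.

```shell
#!/bin/sh
# Minimal daily backup sketch: archive SRC into a date-stamped,
# gzipped tarball under DEST. Paths below are demo placeholders.
SRC=/tmp/demo_data
DEST=/tmp/demo_backup

# Demo setup so the script runs standalone -- drop these two lines
# for real use.
mkdir -p "$SRC" "$DEST"
echo "sample" > "$SRC/example.txt"

# Date-stamp the archive so yesterday's backup isn't overwritten.
STAMP=$(date +%Y%m%d)
tar czf "$DEST/backup_$STAMP.tar.gz" -C "$(dirname "$SRC")" "$(basename "$SRC")"
ls -l "$DEST"
```

Schedule it from crontab as above and you have the whole pipeline.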
Tsonn Guru
Joined: 03 Jun 2004 Posts: 550
Posted: Thu Sep 09, 2004 8:40 pm
You might want to look into using rsync for backups. You can use it to set up scripts that store only the differences between backups, rather than the whole thing each time; thanks to hard linking, each backup still looks like a full copy.
(Note: it actually stores a complete copy of any file which has changed at all, so there's only a benefit if most files stay the same most of the time.)
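The hard-link trick itself is easy to see with cp -al, which is the same mechanism these snapshot setups rely on. A throwaway demo (the /tmp path is just scratch space):

```shell
#!/bin/sh
# Demonstrates hard-linked snapshots: cp -al "copies" a directory
# tree by hard-linking every file, so unchanged files occupy disk
# space only once. /tmp/demo_link is a throwaway demo path.
BASE=/tmp/demo_link
rm -rf "$BASE"
mkdir -p "$BASE/snap.0"
echo "unchanged data" > "$BASE/snap.0/file.txt"

# Looks like a full copy of the snapshot...
cp -al "$BASE/snap.0" "$BASE/snap.1"

# ...but both names point at the same inode (one physical file):
ls -i "$BASE/snap.0/file.txt" "$BASE/snap.1/file.txt"
```

If a file changes between snapshots, the backup tool replaces it with a fresh copy in the new snapshot, leaving the old snapshot's version intact.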
bishop00s Tux's lil' helper
Joined: 27 Apr 2004 Posts: 139 Location: pittsburgh, pa
Posted: Fri Sep 10, 2004 4:36 am
Thank you for your reply. Could you give me an example of how to create a script that would invoke tar and *zip, or use rsync, please? I'm not a programmer, so I'm not familiar with doing this. Which compressor do you think is best?
Tsonn Guru
Joined: 03 Jun 2004 Posts: 550
Posted: Fri Sep 10, 2004 11:15 am
Well, if you use the rsync approach, you won't want to use compression or tar; you'll end up with complete directory trees which you can navigate into and access as if they're full copies, when in fact any file which doesn't change is only on disk once.
This is the script that I wrote for the purpose:
Code: |
#!/bin/bash --
## Commands
ECHO=/bin/echo;
MV=/bin/mv;
CP=/bin/cp;
RSYNC=/usr/bin/rsync;
TOUCH=/bin/touch;
LS=/bin/ls;
SORT=/usr/bin/sort;
RM=/bin/rm;
CAT=/bin/cat;
## Config
DESTINATION=/root/backup;
## Setup variables
NEWEST="none yet"
## Mount the destination
$ECHO "Mounting $DESTINATION.";
mount $DESTINATION
if (( $? )); then
    $ECHO "Could not mount $DESTINATION.";
    exit 1;
fi;
$ECHO "$DESTINATION mounted.";
$ECHO "Shuffling backups.";
## Move backups along one
for CURRENT in $($LS $DESTINATION | $SORT -nr); do
    if [ $CURRENT != lost+found -a -d $DESTINATION/$CURRENT ]; then
        NEXT=$(($CURRENT+1));
        FROM=$DESTINATION/$CURRENT;
        TO=$DESTINATION/$NEXT;
        $MV $FROM $TO;
        NEWEST=$NEXT;
    fi;
done;
$ECHO "Backups shuffled. Newest directory: $NEWEST."
$ECHO "Linking new backup.";
if [ -d "$DESTINATION/$NEWEST" ]; then
$CP -al "$DESTINATION/$NEWEST" $DESTINATION/0;
else
mkdir $DESTINATION/0;
mkdir $DESTINATION/0/root;
fi
$ECHO "New backup linked.";
$ECHO "Running rsync.";
for FOLDER in $($LS /root/etc/backup); do
TARGET=$($CAT /root/etc/backup/$FOLDER);
$RSYNC -va --delete $TARGET $DESTINATION/0/root/$FOLDER;
done;
$ECHO "New backup updated.";
$ECHO "Pruning.";
PREV=0;
SCORE=0;
for CURRENT in $($LS $DESTINATION | $SORT -n); do
    if [ $CURRENT != lost+found -a -d $DESTINATION/$CURRENT ]; then
        SCORE=$(($PREV-($CURRENT/2)));
        if [ $SCORE -gt 0 ]; then
            ## Positive score: remove, with a 1-in-64 chance of a reprieve
            if [ $(($RANDOM & 63)) -eq 0 ]; then
                echo "$CURRENT scores $SCORE, but I'll keep it this time.";
            else
                echo "$CURRENT scores $SCORE, removing.";
                $RM -rf $DESTINATION/$CURRENT;
            fi;
        else
            echo "$CURRENT scores $SCORE, keeping.";
            PREV=$CURRENT;
        fi;
    fi;
done;
$ECHO "Unmounting $DESTINATION.";
$TOUCH $DESTINATION/0;
umount $DESTINATION
if(( $? )); then
$ECHO "Could not unmount $DESTINATION.";
fi;
$ECHO "$DESTINATION unmounted.";
|
Configuration is easy: set the DESTINATION variable to wherever your backup partition is; if it's not a partition, you'll want to remove the parts of the script that do the mounting/unmounting. Still, it's a good idea to keep backups on a different partition, and if possible a different hard disk.
Then make a folder /root/etc/backup. In that folder, put a file for every folder you want backed up. For example:
Code: |
mkdir /root/etc
mkdir /root/etc/backup
echo '/etc/' > /root/etc/backup/etc
echo '/home/david/' > /root/etc/backup/home
|
That will back up /etc and /home/david. Make sure you include the trailing slashes!
So, on to what it does: you'll end up with a set of numbered folders, with the lowest numbers being the most recent. Each one will contain what is (apparently) a complete copy of the folders specified. Each time the script runs there's a chance of removing one of the older backups... I'm not 100% happy with the way this works, but it does what I want: it keeps more recent backups than older ones.
Anyway, there's no harm in trying it; provided you set DESTINATION to an empty folder, I can't see it damaging anything. Once you're convinced it works, add a cron job to run it; I had it running every hour on my previous system (haven't set it up on the new one yet).
bishop00s Tux's lil' helper
Joined: 27 Apr 2004 Posts: 139 Location: pittsburgh, pa
Posted: Fri Sep 10, 2004 5:34 pm
If I understand correctly, I would just copy this code (modified for my system) into Emacs, write it out as a file such as backup.sh, then set my crontab to run it when I want. Is that right? I'll give it a try tonight.
The problem I might run into is that the disk I'm backing up to is older and only 1.9 GB. I wanted to use compression so I can fit more data onto it. Can you show how I could write a separate script that uses compression? Thanks for the help.
BlackEdder Advocate
Joined: 26 Apr 2004 Posts: 2588 Location: Dutch enclave in Egham, UK
Posted: Fri Sep 10, 2004 5:51 pm
I've made a script that backs up only the changed files every day, and one that makes a full backup once a month. Both write to $TEMPDIR and then ftp the result to our backup server (I removed the ftp bit). They name the files am_DAYofmonth_MONTH (for daily backups) and am_MONTH_YEAR (for full backups):
Code: |
#!/bin/sh
#once a day:
DIR=Adminmod/scripting/myscripts
FILENAME=am
DAY=$(date +%d)
MONTH=$(date +%m)
TEMPDIR=$HOME/tmp
cd $HOME
OUT=$TEMPDIR/${FILENAME}_${DAY}_${MONTH}
find $DIR -mtime -2 -type f -print > $OUT.txt
if [ ! -s $OUT.txt ]
then exit
fi
tar zcvf $OUT.tgz -T $OUT.txt
|
Code: |
#!/bin/sh
#complete backup every month
DIR=Adminmod
FILENAME=am
MONTH=$(date +%m_%Y)
TEMPDIR=$HOME/tmp
cd $HOME
tar zcvf $TEMPDIR/${FILENAME}_${MONTH}.tgz $DIR
|
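For scheduling, the two scripts could be run from crontab entries along these lines; the script names and paths here are only examples, so substitute wherever you actually saved them:

```
# min hour dom mon dow  command
30  2  *  *  *   $HOME/bin/daily_backup.sh   > /dev/null
0   3  1  *  *   $HOME/bin/monthly_backup.sh > /dev/null
```

That runs the daily script at 2:30 am every night and the full backup at 3:00 am on the first of each month.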
Tsonn Guru
Joined: 03 Jun 2004 Posts: 550
Posted: Fri Sep 10, 2004 7:00 pm
zion1 wrote: | If I understand correctly, I would just copy this code (modified for my system) into Emacs, write it out as a file such as backup.sh, then set my crontab to run it when I want. Is that right? I'll give it a try tonight.
The problem I might run into is that the disk I'm backing up to is older and only 1.9 GB. I wanted to use compression so I can fit more data onto it. Can you show how I could write a separate script that uses compression? Thanks for the help. |
Yes, that's right.
I don't know of any way of using compression with the method I've mentioned; in any case, that would remove one of the main advantages, which is that you can see the files immediately. If you do need compression, you might be better off using BlackEdder's scripts; you'll have to untar the backups to access them, but they'll take up less space.
bishop00s Tux's lil' helper
Joined: 27 Apr 2004 Posts: 139 Location: pittsburgh, pa
Posted: Mon Sep 20, 2004 2:44 am
26199,
I tried your rsync script and changed the DESTINATION= line to read Code: | DESTINATION=/mnt/backup; |
which is where /dev/hdc1 is mounted.
I mounted /mnt/backup, and the only thing I see is a file called 0.
What am I doing wrong?
Sorry I haven't had the opportunity to post sooner.
Tsonn Guru
Joined: 03 Jun 2004 Posts: 550
Posted: Mon Sep 20, 2004 11:16 am
It puts each backup in a numbered folder... 0 is the most recent. Hopefully it contains files?
If so, that's what you should get if you've only run the script once.
If there aren't any files... did you remember to set up what you wanted backing up? I described how to do that at the end of my post.
bishop00s Tux's lil' helper
Joined: 27 Apr 2004 Posts: 139 Location: pittsburgh, pa
Posted: Mon Sep 20, 2004 1:33 pm
26199 wrote: | If there aren't any files... did you remember to set up what you wanted backing up? I described how to do that at the end of my post... |
I'm a moron; apparently I can't follow directions. Thanks!