Bash script to back up essential log files of a Linux server

by jagbir on February 16, 2010

Here’s a small bash script to back up important log files from a server to a backup server. You should customize it for your environment. I’ve deployed this script on a few hosts and it’s working fine for me, but I can’t guarantee it will work for you as well.

Task: The two most important log files in any Red Hat based distro are /var/log/secure and /var/log/messages. These are the basic log files; there are more when your server performs additional roles such as database server, web server, mail server, etc. You can also look at the log files of other installed software and add them to this script to back them up. I have a separate backup server where I want to transfer my log files after compressing them. You can copy them to some local directory instead if you don’t have a separate backup host or environment.

#!/bin/bash
 
##
## hostlogBackup.sh: perform backup of essential log files. Developed by Jagbir Singh (contact AT jagbir DOT info)
## You are free to use or distribute it however you like, but I'll be happy if you send me a copy of any updated version.
##
 
## create some variables
 
yesterDate=`date -d "-1 day" +%d-%b-%y`  ## yesterday's date, e.g. 15-Feb-10
toDay=`date +%u`   ## day of week in numeric form (1 = Monday)
bakServer="backup-user@server-ip" ## backup server address user@hostname, use directory name if backup in same host 
bakHost="$bakServer:/backup/host/firsthost" ## specify directory where log files will be copied
bakHostDaily="$bakHost/daily/" ## directory for daily backup files
 
cd /var/log ## change to the directory where the log files reside
 
# compress messages log file
cp messages messages-log
/bin/tar czf messages_$toDay.tgz messages-log
 
# compress secure log file
cp secure secure-log
/bin/tar czf secure_$toDay.tgz secure-log
 
# compress mysqld log file. comment out the following 2 lines if you are not using mysql
cp mysqld.log mysqld-log
/bin/tar czf mysqld_$toDay.tgz mysqld-log
 
# compress apache log files. uncomment the following lines if your server runs the apache service.
#cp httpd/access_log ./access-log
#cp httpd/error_log ./error-log
#/bin/tar czf httpd_$toDay.tgz access-log error-log
 
 
# copy all compressed files to the backup server. you must set up key-based authentication for
# passwordless scp (see the key setup example after the script), otherwise you will be prompted for a password
/usr/bin/scp *_$toDay.tgz $bakHostDaily
 
# remove all temp files
rm -f *-log
rm -f *_$toDay.tgz
 
# Apart from daily, take a weekly backup on Monday for files which get rotated on a weekly basis.
if [ "$toDay" = "1" ]; then

        # take backup of rotated messages log file
        if [ -f messages.1 ]; then
                /bin/tar czf message_$yesterDate.tgz messages.1
                /usr/bin/scp message_$yesterDate.tgz $bakHost
                rm -f message_$yesterDate.tgz
        fi

        # take backup of rotated secure log file
        if [ -f secure.1 ]; then
                /bin/tar czf secure_$yesterDate.tgz secure.1
                /usr/bin/scp secure_$yesterDate.tgz $bakHost
                rm -f secure_$yesterDate.tgz
        fi
fi
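
The scp steps above assume passwordless authentication to the backup server is already in place. Here's a minimal sketch of how that is typically set up with SSH keys; the backup-user@server-ip address is just the placeholder from the script, so adjust it to your environment.

# one-time setup on the host being backed up
# generate a key pair if you don't already have one (accept the defaults, leave the passphrase empty)
ssh-keygen -t rsa

# copy the public key to the backup server so scp no longer asks for a password
ssh-copy-id backup-user@server-ip

# verify that passwordless login now works
ssh backup-user@server-ip 'echo ok'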

Again, I’m stressing the point that this is a very basic script and doesn’t handle unforeseen situations such as a file not existing, or compression or copying to the other server failing. You have to handle that yourself.
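
As a minimal sketch of what such handling could look like (not part of the script above, and reusing its variables), you could check the exit status of the critical commands and abort with a message, for example:

# example: abort the daily run if compression or the transfer fails
if ! /bin/tar czf messages_$toDay.tgz messages-log; then
        echo "tar of messages failed" >&2
        exit 1
fi

if ! /usr/bin/scp messages_$toDay.tgz $bakHostDaily; then
        echo "scp to backup server failed" >&2
        exit 1
fi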

The point of taking a weekly backup is that the rotated file combines a week’s log in a single file, which is easy to retain. The daily backup files here get overwritten, but I want to retain the weekly files for a longer duration.
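
If you want to cap how long those weekly archives are kept, one simple option (my own addition, not something the script does) is a periodic cleanup on the backup server, for example deleting weekly files older than roughly three months:

# run on the backup server, e.g. from cron; the path is the backup directory used in the script
find /backup/host/firsthost -maxdepth 1 -name '*.tgz' -mtime +90 -delete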

Now you should run this script daily through cron at around 4:30am. Why 4:30? Because log rotation (logrotate, run from the daily cron job) normally happens shortly after 4:00am, and you want to pick up the freshly rotated files.

$ crontab -l
# backup logs to backup server daily
30 4 * * * /bin/bash /root/logBackup/hostlogBackup.sh
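
If you want to confirm when rotation actually happens on your host, you can check where the daily cron jobs (which include logrotate) are scheduled; the paths below assume a typical Red Hat style layout.

# see when the daily cron jobs run and which rotation scripts are among them
grep cron.daily /etc/crontab
ls /etc/cron.daily/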

That’s all we need to do. Let me know your views about it.
