Tomcat log backup

The following script can be used to archive the log files in the Tomcat log directory.

This script…

  • Archives all the files in the logs directory to the bkup directory
  • Deletes all files older than 'n' days from the bkup directory
#!/bin/bash

arch_logs() {
    for file_pattern; do
        i=1
        # List the matching files, newest first (date-stamped names sort in reverse)
        for log_file in $(find $file_pattern | sort -r); do
            # Skip the first (most recent) file; it may still be written to
            test $i -eq 1 && ((i=i+1)) && continue;
            sudo gzip "$log_file"
            sudo mv "${log_file}.gz" ./bkup/
        done
    done
}

# change directory to tomcat log directory
cd /local/tomcat/logs

# Create the backup directory if it does not exist
mkdir -p bkup

# Gzip the logs matching the following patterns, except the most recent file of each
arch_logs 'catalina.2*' 'host-manager.*' 'localhost.*' 'localhost_access_log.*' 'manager.*'

# Call arch_logs again with an appropriate pattern if you have other log files in a different directory
# ...
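# A hypothetical example (the path and pattern below are placeholders; adjust to your setup):
#   cd /local/myapp/logs && mkdir -p bkup
#   arch_logs 'my-app-*.log'
#   cd /local/tomcat/logs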

# change directory to bkup log directory
cd ./bkup

# Delete files older than 30 days from the bkup directory
backup_days=30
find . -name "*.gz" -mtime +$backup_days -exec sudo rm {} \;

exit 0
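
Assuming the script is saved as /local/tomcat/tomcat-bkup-log.sh (the path used in the cron entry below), make it executable before scheduling it:

sudo chmod +x /local/tomcat/tomcat-bkup-log.sh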

Schedule this script to run periodically (e.g. every day at 1 AM):

sudo crontab -e

Add the following line at the end of the cron file to schedule it. Change the path of the script as appropriate.

0 1 * * * /local/tomcat/tomcat-bkup-log.sh
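
To confirm the entry was saved, list the root crontab:

sudo crontab -l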

Instead of deleting the files, you can upload the logs to an S3 bucket using the AWS CLI.

s3_path=s3://your-s3-domain/your-path/`date --date yesterday +%F`/
sudo aws s3 --region ap-southeast-1 cp bkup $s3_path --recursive --exclude "*" --include "*.gz"
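
If you take this approach, a minimal sketch (reusing the same upload command) that removes the local archives only when the upload succeeds might look like this:

if sudo aws s3 --region ap-southeast-1 cp bkup $s3_path --recursive --exclude "*" --include "*.gz"; then
    # Upload succeeded; remove the local copies
    sudo rm ./bkup/*.gz
fi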

Note: This script works correctly only if the log file name contains the date in a format like 'yyyy-mm-dd'. This format ensures the most recent file is not archived.

For example, if you have files like these

  • my-app-2017-01-22.log
  • my-app-2017-01-23.log
  • my-app-2017-01-24.log

then you call the archive log function within the script as follows:

arch_logs 'my-app-*.log'
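
As an illustration (not part of the script), the selection logic for the files above works like this: the reverse sort puts the newest file first, the loop skips it, and the rest are archived.

find my-app-*.log | sort -r
# my-app-2017-01-24.log   <- newest, skipped (assumed to be the active log)
# my-app-2017-01-23.log   <- gzipped and moved to ./bkup/
# my-app-2017-01-22.log   <- gzipped and moved to ./bkup/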