In my last blog post I detailed how to send backups directly into S3. One reason to send backups to S3, rather than snapshotting your EC2 block devices, is that you can later download those backups and keep some form of your data in house. In this post I'll detail a basic script that does just that.
Now in this script I don't care to download every backup in my S3 bucket, only the most recent. So I'll walk the files in the bucket looking for the newest one and download just that file.
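Finding the newest object is straightforward with the aws-sdk gem: list the bucket's contents and take the object with the latest last_modified timestamp. Here's a minimal sketch of that idea, assuming the aws-sdk v1 gem (installed below) and credentials configured as in the full script:

require 'rubygems'
require 'aws-sdk'

# Sketch only: print the key of the most recently modified object in a bucket
s3 = AWS::S3.new
newest = s3.buckets['mysql-backups'].objects.max_by { |obj| obj.last_modified }
puts newest.key if newest

The full script below does the same thing with a hash keyed on last_modified, which amounts to the same comparison.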
Once again, to use these scripts you first need the aws-sdk Ruby gem, installed here from EPEL.
# Install EPEL
wget http://dl.fedoraproject.org/pub/epel/6/x86_64/epel-release-6-8.noarch.rpm
wget http://rpms.famillecollet.com/enterprise/remi-release-6.rpm
sudo rpm -Uvh remi-release-6*.rpm epel-release-6*.rpm
# Install Ruby
sudo yum install rubygems ruby rubygem-nokogiri rubygem-aws-sdk
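If you want to confirm the gem installed cleanly before pointing anything at S3, a quick one-liner will do (it just checks that the library loads):

# Sanity check that the aws-sdk gem loads
ruby -e 'require "rubygems"; require "aws-sdk"; puts "aws-sdk loaded"'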
Next, here's the actual backup script.
#!/usr/bin/env ruby
# Written by Robert Birnie
# Source: http://www.uberobert.com/download-s3-backups/
require 'rubygems'
require 'aws-sdk'
require 'logger' # the monkey patch below needs Logger loaded
AWS.config(
  :access_key_id     => '*** Provide your access key ***',
  :secret_access_key => '*** Provide your secret key ***'
)
# Backup Settings
mysql_path = '/backups/web/mysql'
mysql_bucket = 'mysql-backups'
www_path = '/backups/web/www'
www_bucket = 'www-backups'
# Logging
# Monkey patch Logger to skip writing the log header; otherwise the
# header text alone would trip the alert described below.
class Logger::LogDevice
  def add_log_header(file)
  end
end
# Remove the old log, so only errors from this run are present
File.delete('/var/log/s3_backuperr.log') if File.exists?('/var/log/s3_backuperr.log')
# Create new log
@log = Logger.new('/var/log/s3_backuperr.log')
@log.level = Logger::WARN
def s3_download(file_name, base, bucket)
  # Get an instance of the S3 interface.
  s3 = AWS::S3.new

  # Download the backup file in chunks.
  key = File.basename(file_name)
  puts "Downloading file #{file_name} from bucket #{bucket}."
  File.open("#{base}/#{file_name}", 'wb') do |file|
    s3.buckets[bucket].objects[key].read do |chunk|
      file.write(chunk)
    end
  end
end
def newest_file(bucket_name)
  files = Hash.new
  s3 = AWS::S3.new
  bucket = s3.buckets[bucket_name]
  bucket.objects.each do |obj|
    files[obj.last_modified] = obj.key
  end
  # files.max returns nil for an empty bucket, so fall back to an empty string
  newest = files.max
  newest ? newest[1] : ''
end
def backup(bucket, path)
  begin
    file = newest_file(bucket)
    unless file.empty? or File.exists? "#{path}/#{file}"
      puts "downloading #{file}"
      s3_download(file, path, bucket)
    end
  rescue Exception => e
    @log.error "Issue with backups from #{bucket}"
    @log.error e
    raise e
  end
end
backup(mysql_bucket, mysql_path)
backup(www_bucket, www_path)
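If you drop the script somewhere like /usr/local/bin (the path and filename here are just examples), it's worth one manual run to confirm the credentials and paths before handing it off to cron:

# Run once by hand, then schedule it
chmod +x /usr/local/bin/download_s3_backups.rb
/usr/local/bin/download_s3_backups.rb

# Example crontab entry to pull backups nightly at 03:30
# 30 3 * * * /usr/local/bin/download_s3_backups.rb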
And that should be about it!
Edit Mar 27, 2014
I rewrote some of the script to add logging. The log is watched by an Icinga/Nagios check to see how backups are going: I use the file's mtime to tell whether backups are running, and I count any text in the file as an error. That's why the old file is deleted on each run.
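For reference, the check could look something like the sketch below. This is just an illustration of the logic described above (mtime for freshness, any content counts as an error), not the actual check I run, and the age threshold is an arbitrary example.

#!/usr/bin/env ruby
# Hypothetical Nagios-style check for the backup log described above.
# Exit codes follow the Nagios convention: 0 = OK, 1 = WARNING, 2 = CRITICAL.
log = '/var/log/s3_backuperr.log'
max_age = 60 * 60 * 26 # a bit more than a day, since backups run daily

unless File.exists?(log)
  puts "CRITICAL: #{log} missing, backup script may never have run"
  exit 2
end

if Time.now - File.mtime(log) > max_age
  puts "CRITICAL: #{log} untouched for over #{max_age / 3600} hours"
  exit 2
elsif File.size(log) > 0
  puts "WARNING: backup errors logged, check #{log}"
  exit 1
else
  puts "OK: backups running with no errors logged"
  exit 0
end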