Dynamically Sync Your Files With Amazon S3

I’ve been getting used to using the cloud over the past week or so and I’ve got to say it’s much easier to manage than I had imagined. I know there are tons of great desktop clients to manage your S3 buckets and files, but I wanted something to do all of the management for me – to make things simple.

I’ve put together a script using Undesigned’s awesome PHP S3 class that will scan a directory, compare all of the files in it to those in a given directory on your bucket, and then upload missing files or replace unmatched ones.

You could extend this PHP script to do a lot more than it does currently; it was written as a quick-and-dirty solution. Here’s the basis of it. You can download the full script and classes below.

// Check whether the file already exists in the bucket and can return file info
if (($info = $s3->getObjectInfo($bucket, $uri)) !== false) {
    // If the local hash doesn't match the remote hash, upload and replace the file
    if ($info['hash'] !== $local_md5) {
        echo "Can't match file: $the_file -- now uploading\n";
        echo "Putting $the_file . . . at $uri ";
        // This uploads files with private permissions - set the permissions you
        // need by changing S3::ACL_PRIVATE to whichever level you require.
        if ($s3->putObject(S3::inputFile($object), $bucket, $uri, S3::ACL_PRIVATE,
            array(),
            array( // Custom $requestHeaders
                "Cache-Control" => "max-age=315360000",
                "Expires" => gmdate("D, d M Y H:i:s T", strtotime("+5 years"))
            ))) {
            echo "OK\n";
        } else {
            echo "ERROR!\n";
        }
    } else {
        echo "The hash values for $the_file are equal\n";
    }
} else {
    // The file doesn't exist in your bucket, so upload it
    echo "Can't find that file: $the_file -- now uploading\n";
    echo "Putting $the_file . . . at $uri ";
    if ($s3->putObject(S3::inputFile($object), $bucket, $uri, S3::ACL_PUBLIC_READ,
        array(),
        array( // Custom $requestHeaders
            "Cache-Control" => "max-age=315360000",
            "Expires" => gmdate("D, d M Y H:i:s T", strtotime("+5 years"))
        ))) {
        echo "OK\n";
    } else {
        echo "ERROR!\n";
    }
}
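For context, here’s a minimal sketch of the setup and loop that the snippet above plugs into. The variable names and paths ($s3, $bucket, $the_file, $object, $uri, $local_md5) are assumptions inferred from the snippet, and the credentials and directories are placeholders; substitute your own:

// Minimal sketch of the surrounding setup and loop (names assumed from the snippet above)
require_once 'S3.php'; // Undesigned's PHP S3 class

$s3     = new S3('YOUR_ACCESS_KEY', 'YOUR_SECRET_KEY'); // placeholder credentials
$bucket = 'yourbucketname';
$local_dir  = '/path/to/local/files'; // directory to scan
$remote_dir = 'remote/prefix';        // "directory" (key prefix) in your bucket

foreach (scandir($local_dir) as $the_file) {
    if ($the_file == '.' || $the_file == '..') continue;

    $object    = $local_dir . '/' . $the_file;  // full local path
    $uri       = $remote_dir . '/' . $the_file; // key in the bucket
    $local_md5 = md5_file($object);             // hash compared against S3's ETag

    // ... the compare-and-upload logic from the snippet above goes here ...
}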
Get the Class
Get the sync script

Head in the clouds: Amazon S3 and CloudFront with Joomla! and PHP

I’m managing a LAMP server on a virtual dedicated setup and have recently run into some performance issues. During periods of heavier traffic, the CPU usage shoots up. This could be partly because of the platform running on it, but I’m not sure that the PHP scripting is the root cause.

To alleviate some of my concerns, I’ve moved most of the static images, CSS, and JavaScript over to Amazon S3 and set up a CDN for that data with CloudFront. It’s fast. Really fast compared to what we’re used to. It has drastically improved load times, and from what I can tell it is improving the performance of the site altogether by reducing the load on the server.

Caching is out in full force for the main Joomla site, while the content being served from Amazon S3 is gzipped, minified, and speedy. I’m still working through YSlow to get things even more streamlined, but I’m more satisfied now than before.

I was using a minify component to grab scripts on the fly and combine them, but that’s a nightmare for performance, so I’m going to revamp things and get the remainder of those files out to the CDN, compressed.

For those who are interested, I’m serving gzipped JS and CSS, but using varied methods depending on whether or not the connection is secure. Since S3 does not function over HTTPS using CNAMEs, I had to set up a check to see which URL to use for my CDN files.

Something like this did the trick for my Joomla! template (placed in the template’s index.php):

// Grab the document object so we can attach stylesheets
$document = JFactory::getDocument();

// Check whether the browser supports gzip content
$gz = (isset($_SERVER['HTTP_ACCEPT_ENCODING'])
    && strpos($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip') !== false) ? 'gz' : '';

if (!isset($_SERVER['HTTPS']) || $_SERVER['HTTPS'] != 'on') {
    // Plain http: use the CNAME URL for the CDN
    $document->addStyleSheet('http://cdn.yoursite.com/templates/css/style.' . $gz . 'css');
} else {
    // https: CNAMEs don't work over SSL, so use the URL Amazon provides
    $document->addStyleSheet('https://yourbucketname.s3.amazonaws.com/templates/css/style.' . $gz . 'css');
}

I’ve got this loading the CNAME URLs, such as cdn.yourdomain.com/css/style.gzcss, for the non-secure connection, but using the secure URLs that Amazon provides for the secure connection.

And no, style.gzcss is not a typo. There were issues with how the browser was grabbing files with the regular style.gz.css extension, even when I had set the Content-Encoding to gzip and the Content-Type to text/css. This has worked for me, but there may be a more elegant solution floating around. CloudBerry Explorer has been a great asset in doing this quickly and efficiently.
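If you’re wondering how those .gzcss files get built and uploaded, here’s a rough sketch using the same S3 class from the first post. The bucket name and paths are placeholders; the key parts are pre-gzipping the file locally (S3 won’t compress on the fly) and setting Content-Type and Content-Encoding as request headers on putObject:

// Rough sketch: pre-gzip a stylesheet and upload it as style.gzcss
// with the headers the browser needs (bucket and paths are placeholders)
require_once 'S3.php';

$s3     = new S3('YOUR_ACCESS_KEY', 'YOUR_SECRET_KEY');
$bucket = 'yourbucketname';

// Gzip the stylesheet locally
$css = file_get_contents('templates/css/style.css');
file_put_contents('templates/css/style.gzcss', gzencode($css, 9));

// Upload with Content-Type and Content-Encoding set so browsers decode it
if ($s3->putObject(S3::inputFile('templates/css/style.gzcss'), $bucket,
    'templates/css/style.gzcss', S3::ACL_PUBLIC_READ,
    array(),
    array(
        "Content-Type"     => "text/css",
        "Content-Encoding" => "gzip",
        "Cache-Control"    => "max-age=315360000"
    ))) {
    echo "Uploaded style.gzcss\n";
} else {
    echo "Upload failed\n";
}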