I’ve been getting used to the cloud over the past week or so, and I have to say it’s much easier to manage than I’d imagined. I know there are plenty of great desktop clients for managing your S3 buckets and files, but I wanted something that would do all of the management for me – to keep things simple.
I’ve put together a script using Undesigned’s awesome PHP S3 class that will scan a local directory, compare every file in it against a given directory in your bucket, and then upload any missing files or replace files whose hashes don’t match.
You could extend this PHP script to do a lot more than it currently does; it was written as a quick and dirty solution. Here’s the basis of it. You can download the full script and classes below.
//check whether the file already exists in the bucket; getObjectInfo()
//returns the object's info on success, or false if it's missing
if (($info = $s3->getObjectInfo($bucket, $uri)) !== false) {
    //if the local hash doesn't match the remote hash, upload and replace the file
    if ($info['hash'] != $local_md5) {
        print_r('Can\'t match file: '.$the_file.' -- Now uploading');
        echo "Putting $the_file . . . at $uri ";
        //this will upload files with private permissions - set the permissions as
        //needed by changing S3::ACL_PRIVATE to whichever level you need
        if ($s3->putObject(S3::inputFile($object), $bucket, $uri, S3::ACL_PRIVATE, array(), array(
            // Custom $requestHeaders
            "Cache-Control" => "max-age=315360000",
            "Expires" => gmdate("D, d M Y H:i:s T", strtotime("+5 years"))
        )))
            echo "OK\n";
        else
            echo "ERROR!\n";
    } else {
        echo "The hash values for $the_file are equal ";
    }
} else {
    //file doesn't exist in your bucket, so upload it
    print_r('Can\'t find that file: '.$the_file.' -- Now uploading');
    echo "Putting $the_file . . . at $uri ";
    if ($s3->putObject(S3::inputFile($object), $bucket, $uri, S3::ACL_PUBLIC_READ, array(), array(
        // Custom $requestHeaders
        "Cache-Control" => "max-age=315360000",
        "Expires" => gmdate("D, d M Y H:i:s T", strtotime("+5 years"))
    )))
        echo "OK\n";
    else
        echo "ERROR!\n";
}
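The snippet above assumes some scaffolding is already in place: an $s3 client, plus $the_file, $object, $uri, and $local_md5 for each file. Here’s a minimal sketch of how that local-side setup might look – the directory names and the buildUri() helper are my own assumptions for illustration, not part of the original script, and the S3 calls themselves are left as a placeholder comment:

```php
<?php
// Hypothetical helper: map a local path to its key under a prefix in the bucket.
function buildUri($remotePrefix, $localDir, $localPath) {
    $relative = substr($localPath, strlen(rtrim($localDir, '/')) + 1);
    return rtrim($remotePrefix, '/') . '/' . $relative;
}

$localDir     = './public';     // directory to sync (assumed)
$remotePrefix = 'site-assets';  // target "directory" in the bucket (assumed)

if (is_dir($localDir)) {
    // Walk the local directory recursively.
    $iterator = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($localDir, FilesystemIterator::SKIP_DOTS)
    );
    foreach ($iterator as $fileInfo) {
        if (!$fileInfo->isFile()) continue;
        $object    = $fileInfo->getPathname();  // full local path, fed to S3::inputFile()
        $the_file  = $fileInfo->getFilename();  // bare filename, used in the messages
        $uri       = buildUri($remotePrefix, $localDir, $object);
        $local_md5 = md5_file($object);         // compared against the remote hash
        // ...the getObjectInfo()/putObject() logic from the snippet goes here...
    }
}
```

Note that comparing md5_file() output against the remote hash works for simple uploads, since S3’s ETag is the MD5 of the object in that case.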
Get the Class | Get the sync script