Carbonite Online Backup – Don’t do it.

I’ve got a lot of data to back up. Lots. I have tried several services and was unhappy with both the pace of the backups and their lack of reliability.

I landed on Carbonite as the next product to try out. Given the volume of data I had to back up, I gave it a shot. Initial backups were going at a steady pace. Faster than I expected, really. But after the first 30 days or so it slowed to a crawl. At that rate it would have taken about 18 months to complete my initial backup, and by then I would have had double the data. It was never really going to keep pace, and that was a huge deal breaker.

In addition to the slow pace of the initial backup, the backup engine software itself failed to work more than a dozen times, resulting in weeks of absolutely no backups. I had to keep restarting my computer, re-installing the software, and hoping it would just start working. At one point it told me it was going to have to start the backup over again. That was the icing on the cake, so I gave up on it. I couldn’t trust it with my data.

This service just wasn’t going to work; Carbonite isn’t a good solution. I emailed support and asked to cancel the account and refund my remaining balance. A bit later I got a response saying I needed to call them during certain hours before I could move forward. I let some time pass before I tried again.

I emailed support once more with the same request: cancel and refund. Same line. They wanted me to call them during certain hours. I responded to let them know that I would not be calling them and that this was a simple request. After a few more email exchanges and confirmations of confirmations, I was finally done with the service. Almost. At no point did they have any intention of refunding my year-long subscription, of which I had used eight weeks. They still haven’t given me a refund; they only pointed me to their policy on refunds, which is that they don’t give them.

I cannot recommend Carbonite Online Backup for any real-world use, and the way they do business doesn’t help. It would have been one thing if the service just couldn’t perform, but it’s another thing entirely to refuse a refund when someone is completely unsatisfied with the service.

Dynamically Sync Your Files With Amazon S3

I’ve been getting used to using the cloud over the past week or so and I’ve got to say it’s much easier to manage than I had imagined. I know there are tons of great desktop clients to manage your S3 buckets and files, but I wanted something to do all of the management for me – to make things simple.

I’ve put together a script using Undesigned’s awesome PHP S3 class that will scan a directory, compare each file in it to the corresponding file in a given directory on your bucket, and then upload any missing files or replace any whose hashes don’t match.

You could extend this PHP script to do a lot more than it does currently; it was written as a quick and dirty solution. Here’s the basis of it. You can download the full script and classes below.

//check whether the file already exists in the bucket and fetch its info
if (($info = $s3->getObjectInfo($bucket, $uri)) !== false) {
    //if the local hash doesn't match the remote hash, upload and replace the file
    if ($info['hash'] !== $local_md5) {
        echo 'Can\'t match file: '.$the_file.' -- Now uploading';
        echo "Putting $the_file . . . at $uri ";
        //this will upload the file with private permissions - set the permissions you
        //need by changing S3::ACL_PRIVATE to whichever level you want
        if ($s3->putObject(S3::inputFile($object), $bucket, $uri, S3::ACL_PRIVATE,
            array(),
            array( // Custom $requestHeaders
                "Cache-Control" => "max-age=315360000",
                "Expires" => gmdate("D, d M Y H:i:s T", strtotime("+5 years"))
            ))) {
            echo "OK\n";
        } else {
            echo "ERROR!\n";
        }
    } else {
        //the hashes match, so the remote copy is already up to date
        echo "The hash values for $the_file are equal ";
    }
} else {
    //file doesn't exist in your bucket, so upload it
    echo 'Can\'t find that file: '.$the_file.' -- Now uploading';
    echo "Putting $the_file . . . at $uri ";
    if ($s3->putObject(S3::inputFile($object), $bucket, $uri, S3::ACL_PUBLIC_READ,
        array(),
        array( // Custom $requestHeaders
            "Cache-Control" => "max-age=315360000",
            "Expires" => gmdate("D, d M Y H:i:s T", strtotime("+5 years"))
        ))) {
        echo "OK\n";
    } else {
        echo "ERROR!\n";
    }
}
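
The excerpt above leans on a handful of variables ($s3, $bucket, $uri, $the_file, $object and $local_md5) that the full script sets up before the comparison runs. As a rough sketch of what that setup might look like (the paths, bucket name, and extra variable names here are my own placeholders, not necessarily what the downloadable script uses):

require_once 'S3.php'; // Undesigned's PHP S3 class

$s3     = new S3('YOUR_ACCESS_KEY', 'YOUR_SECRET_KEY');
$bucket = 'your-bucket-name';     // placeholder bucket name
$local  = '/path/to/local/dir';   // placeholder local directory to scan
$remote = 'backups/';             // placeholder "directory" (key prefix) in the bucket

foreach (scandir($local) as $the_file) {
    // skip the . and .. entries returned by scandir()
    if ($the_file === '.' || $the_file === '..') {
        continue;
    }

    $object = $local . '/' . $the_file; // full local path handed to S3::inputFile()
    if (is_dir($object)) {
        continue; // this sketch only handles files at the top level
    }

    $uri       = $remote . $the_file; // key to look up and upload in the bucket
    $local_md5 = md5_file($object);   // local hash compared against the remote hash

    // ...the comparison and upload logic from the excerpt above goes here...
}

Note that a sketch like this only walks a single directory level; recursing into subfolders is something the full script or your own extension would need to handle.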
Get the Class
Get the sync script