DirectoryPress Broken CSV export

If you use DirectoryPress 7.0.9 you may have noticed that the CSV export function isn’t working properly. This is going to be a quick “edit this here” fix.

Edit the admin-save.php file in your /wp-content/themes/directorypress/admin directory.

Look around line 1105 or so; you should find:

$dat = array_merge($dat, $FF);

Change that to:

$dat = $dat + $FF;

Save the file and upload it to your server. The CSV export should now work properly.
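
For context, here's a quick illustration of the difference between the two calls (and my guess at why the union operator behaves better here): array_merge() renumbers numeric keys and appends values, while + keeps the left-hand array's keys and only adds keys that aren't already present. You can run this in any PHP interpreter:

<?php
$a = array(0 => 'id', 1 => 'title');
$b = array(1 => 'description', 2 => 'price');

// array_merge() renumbers numeric keys and appends everything
print_r(array_merge($a, $b)); // [0 => id, 1 => title, 2 => description, 3 => price]

// the union operator keeps $a's keys and only adds keys missing from it
print_r($a + $b);             // [0 => id, 1 => title, 2 => price]
?>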

Parse error: syntax error, unexpected $end in /app/views/layouts/default.ctp – CakePHP

I just ran into an unexpected error while setting up a portable development server on a USB stick using the latest version of XAMPP Portable. The application in question runs on the CakePHP framework and was copied directly from another, fully functioning Windows-based machine.

After scratching my head for a few moments, I realized that the default configuration in XAMPP’s php.ini might differ from the one on the original machine.

The fix is easy. Open the php.ini for your PHP installation and change the following setting from Off to On:

short_open_tag = On

Then restart Apache to apply the change.
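
For reference, the kind of view code that triggers this is anything using PHP short tags. A hypothetical snippet (not from the actual default.ctp) shows the failure mode: with short_open_tag set to Off, the short closing tag below is treated as plain text, the if block is never closed, and PHP reports "unexpected $end" at the end of the file.

<?php if ($loggedIn): ?>
    <p>Welcome back!</p>
<? endif; ?>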

PHP Fatal error: Call to undefined function curl_init()

I recently set up a development server in a VirtualBox VM running Ubuntu Server 11.04. My plan was to move a development database and website to this VM and migrate away from a local XAMPP installation on a Windows box. The only problem was that Apache and PHP were not configured identically on the two systems, which left some of my scripts broken. More specifically, I was getting the “PHP Fatal error: Call to undefined function curl_init()” error on my Ubuntu VM.

This likely meant that curl was not installed or enabled on Ubuntu, so here’s what I did:

  1. Run this command from a terminal:
    • sudo apt-get install curl libcurl3 libcurl3-dev php5-curl
  2. Make sure curl is enabled in your php.ini. This file is usually in “/etc/php5/apache2/php.ini”
    • In the section for dynamic extensions add (or uncomment):
      • extension=php_curl.so
  3. Restart Apache:
    • sudo /etc/init.d/apache2 restart
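
Once Apache is back up, a quick sanity check like this (a throwaway script, not part of the steps above) confirms that the cURL extension is loaded:

<?php
// Both lines should print bool(true) once cURL is installed and enabled.
var_dump(extension_loaded('curl'));
var_dump(function_exists('curl_init'));
?>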

How To: CakePHP form input without labels

One of the great things about CakePHP is how easy it is to get very common tasks accomplished in a short amount of time. Form building is no exception, but there are many options to specify – and at times those options are not immediately apparent.

The standard input syntax looks like this:

input(string $fieldName, array $options = array())

To have your form inputs rendered without labels, specify 'label' => false in the input options.

For example:

<?php
echo $form->create('User', array('action' => 'login', 'class' => 'header-login'));
echo $form->input('username', array('label' => false));
echo $form->input('password', array('label' => false));
echo $form->end('Login');
?>

This will show your form inputs without the label elements.

Dynamically Sync Your Files With Amazon S3

I’ve been getting used to using the cloud over the past week or so and I’ve got to say it’s much easier to manage than I had imagined. I know there are tons of great desktop clients to manage your S3 buckets and files, but I wanted something to do all of the management for me – to make things simple.

I’ve put together a script using Undesigned’s awesome PHP S3 class that will scan a directory, compare each file in it against the corresponding object in a given directory of your bucket, and then upload any missing files and replace any whose hashes don’t match.

You could set this PHP script up to do a lot more than it does currently, as it was set up to be a quick and dirty solution. Here’s the basis of it. You can download the full script and classes below.

//check to see if the file exists and can return file info
if (($info = $s3->getObjectInfo($bucket, $uri)) !== false) {
    //if the local hash doesn't match the remote hash, upload and replace the file
    if ($info['hash'] != $local_md5) {
        print_r('Can\'t match file: '.$the_file.' -- Now uploading');
        echo "Putting $the_file . . . at $uri ";
        //this will upload files with private permissions - set the permissions as
        //needed by changing S3::ACL_PRIVATE to whichever level you need
        if ($s3->putObject(S3::inputFile($object), $bucket, $uri, S3::ACL_PRIVATE,
            array(),
            array( // Custom $requestHeaders
                "Cache-Control" => "max-age=315360000",
                "Expires" => gmdate("D, d M Y H:i:s T", strtotime("+5 years"))
            ))) {
            echo "OK\n";
        } else {
            echo "ERROR!\n";
        }
    } else {
        echo "The hash values for $the_file are equal ";
    }
} else {
    //file doesn't exist in your bucket, so upload it
    print_r('Can\'t find that file: '.$the_file.' -- Now uploading');
    echo "Putting $the_file . . . at $uri ";
    if ($s3->putObject(S3::inputFile($object), $bucket, $uri, S3::ACL_PUBLIC_READ,
        array(),
        array( // Custom $requestHeaders
            "Cache-Control" => "max-age=315360000",
            "Expires" => gmdate("D, d M Y H:i:s T", strtotime("+5 years"))
        ))) {
        echo "OK\n";
    } else {
        echo "ERROR!\n";
    }
}
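
The excerpt runs once per file; the loop that drives it looks roughly like the sketch below. This is a simplified version, $local_dir and $remote_prefix are placeholder variables, and the full logic is in the downloadable script.

<?php
require_once 'S3.php'; // Undesigned's S3 class

$s3     = new S3($awsAccessKey, $awsSecretKey);
$bucket = 'your-bucket-name';

foreach (new DirectoryIterator($local_dir) as $file) {
    if (!$file->isFile()) {
        continue; // skip ".", ".." and subdirectories
    }
    $the_file  = $file->getFilename();
    $object    = $file->getPathname();       // full local path
    $uri       = $remote_prefix . $the_file; // object key in the bucket
    $local_md5 = md5_file($object);          // compared against the hash S3 reports

    // ... comparison and upload logic from the excerpt above ...
}
?>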
Get the class | Get the sync script

Head in the clouds: Amazon S3 and CloudFront with Joomla! and PHP

I’m managing a LAMP server on a virtual dedicated setup and have recently run into some performance issues. During periods of heavier traffic, CPU usage shoots up. This could be partly because of the platform running on it, but I’m not sure the PHP scripting is the root cause.

To alleviate some of my concerns, I’ve moved most of the static images, CSS, and JavaScript over to Amazon S3 and set up a CDN for that data with CloudFront. It’s fast. Really fast compared to what we’re used to. It has drastically improved load times, and from what I can tell it is improving the performance of the site overall by reducing the load on the server.

Caching is out in full force for the main Joomla site, while the content served from Amazon S3 is gzipped, minified, and speedy. I’m still working through YSlow to streamline things further, but I’m more satisfied now than before.

I was using a minify component to grab scripts on the fly and combine them, but that’s a nightmare for performance, so I’m going to revamp things and move the remainder of those files out to the CDN, compressed.

For those who are interested, I’m serving gzipped JS and CSS, but the method varies depending on whether the connection is secure. Since S3 does not work over HTTPS when using CNAMEs, I had to set up a check to decide which URL to use for my CDN files.

Something like this did the trick for my Joomla! template (placed in the template’s index.php):

//check to see if the browser supports gzip content
$gz = (isset($_SERVER['HTTP_ACCEPT_ENCODING']) && strpos($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip') !== false) ? 'gz' : '';

if (empty($_SERVER['HTTPS']) || $_SERVER['HTTPS'] != 'on') {
    //plain http: load from the CloudFront CNAME
    $document->addStyleSheet('http://cdn.yoursite.com/templates/css/style.'.$gz.'css');
} else {
    //https: the CNAME won't work over SSL, so use the URL Amazon provides
    $document->addStyleSheet('https://yourbucketname.s3.amazonaws.com/templates/css/style.'.$gz.'css');
}

I’ve got this loading the CNAME URLs, such as cdn.yourdomain.com/css/style.gzcss, for the non-secure connection, but using the secure URLs that Amazon provides for the secure connection.

And no, style.gzcss is not a typo. There were issues with how browsers were fetching files with the regular style.gz.css extension, even when I had set the Content-Encoding to gzip and the Content-Type to text/css. This has worked for me, but there may be a more elegant solution floating around. CloudBerry Explorer has been a great asset in doing this quickly and efficiently.
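
If you want to replicate the .gzcss setup, this is roughly how a compressed stylesheet can be pushed up with the same S3 class used in the sync script above. It's only a sketch; the bucket name and file paths are placeholders.

<?php
require_once 'S3.php';

$s3     = new S3($awsAccessKey, $awsSecretKey);
$bucket = 'yourbucketname';

// gzip the stylesheet locally, using the .gzcss naming mentioned above
file_put_contents('style.gzcss', gzencode(file_get_contents('style.css'), 9));

// upload it with the headers the browser needs to treat it as compressed CSS
$s3->putObject(S3::inputFile('style.gzcss'), $bucket, 'templates/css/style.gzcss',
    S3::ACL_PUBLIC_READ,
    array(),
    array(
        'Content-Type'     => 'text/css',
        'Content-Encoding' => 'gzip',
        'Cache-Control'    => 'max-age=315360000'
    )
);
?>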