Speed up page loading significantly


BUCKHORN

Here's an example of the performance boost I gained by doing this...

My page was 26kb before, and 4kb after. Page loading time was reduced by almost half.

In site/config.php, uncomment $config->prependTemplateFile and $config->appendTemplateFile:

/**
 * prependTemplateFile: PHP file in /site/templates/ that will be loaded before each page's template file
 *
 * Uncomment and edit to enable.
 *
 */
 $config->prependTemplateFile = '_in.php'; //you can name this file whatever you want

/**
 * appendTemplateFile: PHP file in /site/templates/ that will be loaded after each page's template file
 *
 * Uncomment and edit to enable.
 *
 */
 $config->appendTemplateFile = '_out.php'; //you can name this file whatever you want

Create two new files in site/templates called _in.php and _out.php.

In _in.php, add the following at the top of the file (before any other code you add later):

<?php

// Start a gzip-compressing output buffer, unless one is already active
if (!in_array('ob_gzhandler', ob_list_handlers())) {
	ob_start('ob_gzhandler');
} else {
	ob_start();
}
?>

In _out.php, add the following at the bottom of the file (after any other code you add later):

<?php ob_end_flush(); ?>
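To make the flow concrete: every page's template file is rendered between _in.php and _out.php, so whatever it echoes passes through the compressed buffer. Here's a minimal sketch of such a template, assuming the title and body fields from the default site profile (adjust to your own fields):

<?php
// basic-page.php - minimal sketch of a page template; the fields used here
// (title, body) are only an example from the default profile.
// _in.php has already started the (gzip) output buffer at this point,
// and _out.php will flush it after this file finishes.
echo "<h1>{$page->title}</h1>";
echo $page->body;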

Good job! Thanks for the tip.

Please note that before implementing this you must check your PHP zlib configuration, because the official PHP documentation states that you cannot use both ob_gzhandler() and zlib.output_compression.

To check, either run 

php -i | grep zlib.output_compression 

if you are on a console, or do the usual phpinfo() thing and search for zlib.output_compression.
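If you would rather check from code than from the console, here is a minimal sketch (nothing assumed beyond standard PHP; for boolean-style ini values, ini_get() returns an empty string or '0' when the directive is off):

<?php
// Report whether zlib output compression is enabled for this PHP setup
$zlib = ini_get('zlib.output_compression');
echo 'zlib.output_compression: ' . ($zlib && $zlib !== '0' ? 'on' : 'off');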

To note the obvious: if you are using PW 2.3.0, the default file names are:

$config->prependTemplateFile = '_init.php'; 

and

$config->appendTemplateFile = '_done.php'; 

With the almost untouched default site profile, I got 34 kbytes without compression vs 8 kbytes with compression (page /about/; traffic measured with Comodo Dragon, which is based on Chromium; screenshots are attached).

One last remark: the PHP docs recommend turning on zlib.output_compression instead, which is understandable once you start taking PW caching into account.
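For reference, a minimal sketch of switching it on from PHP; whether the ini_set() call is honored depends on your host, and it has to run before any output is sent (the equivalent php.ini line is zlib.output_compression = On):

<?php
// Sketch: enable zlib output compression at runtime, e.g. at the very top
// of _init.php, assuming the host allows changing it via ini_set()
ini_set('zlib.output_compression', 'On');
ini_set('zlib.output_compression_level', '6'); // 1 = fastest, 9 = smallest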



Thanks, that's a good point. 

I also wondered if changing the default names would be confusing to some, so I appreciate you listing them.

Here's another approach which would cover the scenario in which the pages are cached to a static file.

To use mod_rewrite to handle gzip, you'd add something like this to your Apache config or .htaccess:
 
<IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteCond %{HTTP:Accept-Encoding} gzip
    RewriteCond %{REQUEST_FILENAME}.gz -f
    RewriteRule (.*\.(js|css))$ $1.gz [L]
</IfModule>

AddEncoding x-gzip .gz

<FilesMatch .*\.css.gz>
    ForceType text/css
</FilesMatch>

<FilesMatch .*\.js.gz>
    ForceType application/x-javascript
</FilesMatch>

http://www.cravediy.com/59-Simple-gzip-Support-for-Apache-with-mod_rewrite.html
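Note that the rewrite rule only serves foo.css.gz if that file already exists, so the compressed copies have to be generated beforehand. A rough sketch in PHP (the styles/ and scripts/ paths are just examples; running gzip -k over the same files from a shell would work equally well):

<?php
// Sketch: pre-generate .gz copies of CSS/JS so the rewrite rule above can
// serve them; adjust the example paths to wherever your assets live
$assets = array_merge(
    glob(__DIR__ . '/site/templates/styles/*.css'),
    glob(__DIR__ . '/site/templates/scripts/*.js')
);
foreach ($assets as $file) {
    // compression level 9 (smallest) is fine for files generated once
    file_put_contents($file . '.gz', gzencode(file_get_contents($file), 9));
}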


<?php

// Start a gzip-compressing buffer only if zlib.output_compression is off
// and no ob_gzhandler buffer is already active; otherwise use a plain buffer
if ((!ini_get('zlib.output_compression')) && (!in_array('ob_gzhandler', ob_list_handlers()))) {
	ob_start('ob_gzhandler');
} else {
	ob_start();
}
?>

This is a minor improvement to take into account Valery's comment above. It checks to see if php zlib.output_compression is enabled. It doesn't check to see if Apache is using compression. This may be a solution to that: http://stackoverflow.com/questions/9397295/how-to-check-if-gzip-compression-is-enabled-with-php
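One way to approximate that check from PHP is to look for mod_deflate; this only works when PHP runs as an Apache module (under FPM/CGI, apache_get_modules() doesn't exist), so treat it as a sketch rather than a reliable test:

<?php
// Sketch: skip ob_gzhandler if Apache's mod_deflate appears to be loaded
$apacheCompression = function_exists('apache_get_modules')
    ? in_array('mod_deflate', apache_get_modules())
    : false; // can't tell under FPM/CGI, so assume no server-side compression

if (!$apacheCompression && !ini_get('zlib.output_compression')
        && !in_array('ob_gzhandler', ob_list_handlers())) {
    ob_start('ob_gzhandler');
} else {
    ob_start();
}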


I'm not an expert on this, so correct me if I'm wrong, but wouldn't you be better off doing this with htaccess? This post by WillyC seems to have some good suggestions in that regard.

But it seems like there would be real downsides to doing this with PHP rather than htaccess. For example, if using PW's template cache or ProCache combined with ob_gzhandler in PHP, your cache files would end up gzipped and presumably re-gzipped when delivered. Or, if PHP is smart enough not to re-gzip something, then it seems likely that clients that don't support gzip could still get cached gzipped files? We could probably get around these issues by providing ob_gzhandler to ProcessWire's root ob_start() call.

Still, I'd be curious to know the benefits of doing this with PHP rather than Apache before suggesting that as a configuration option. I had always assumed this was something best left to Apache, which can do the same with your static assets too (CSS, JS, etc.).


It's been a while since I've worked with page compression, but here's what I remember.

This checks whether content is already being gzipped by looking at the currently loaded output handlers and at the zlib.output_compression setting (PHP isn't smart enough to sort this out on its own). If there is already a gzipped buffer, it starts a plain ob_start(); if not, PHP starts a gzipped buffer when the client supports compression and leaves the output uncompressed otherwise (ob_gzhandler is smart like that).

if ((!ini_get('zlib.output_compression')) && (!in_array('ob_gzhandler', ob_list_handlers()))) {
    ob_start('ob_gzhandler');
} else {
    ob_start();
}

As far as dealing with cached gzipped files goes, I wonder if you'd gain some performance by saving compressed data. There are PHP functions for dealing with compression that will compress and decompress gzipped data; you'd handle this the same way you would serialized data.

A quick and dirty process flow would be something like this (keep in mind I haven't used the ProCache module, nor do I yet understand how the default caching works); a rough sketch follows after the list...

  • save compressed data to file
  • on the next page request
  • start gzip if it's supported by the client
  • else start the regular buffer
  • route the request and get the cache file
  • decompress and unserialize the cache file and dump into the current buffer. Data is uncompressed at this point and only compressed back if gzip is supported by client. (I wonder if it's possible to dump a compressed cache file into the current buffer and only decompress if the client doesn't support compression.) 
  • flush the buffer and display the page
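Here is a minimal sketch of that flow; the cache path is made up for illustration, and this is not how PW's template cache or ProCache actually store their files:

<?php
// Sketch of the process flow above, with a made-up cache file path
$cacheFile = __DIR__ . '/cache/page-about.html.gz';

// start gzip output if the client supports it, otherwise a plain buffer
if (!in_array('ob_gzhandler', ob_list_handlers())) {
    ob_start('ob_gzhandler');
} else {
    ob_start();
}

if (is_file($cacheFile)) {
    // decompress the cached markup into the buffer; ob_gzhandler compresses
    // it again on the way out only when the client accepts gzip
    echo gzdecode(file_get_contents($cacheFile));
} else {
    $html = '<html>...</html>'; // placeholder for normally rendered output
    file_put_contents($cacheFile, gzencode($html, 6)); // save a compressed copy
    echo $html;
}

ob_end_flush();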

As far as advantages go, there are probably fewer for PW. I see using gzip with PHP as an advantage in distributable applications targeting users with little to no programming or server admin knowledge. For those folks on shared hosting who want to install and go without hassle, this is a good option: it provides a big boost in performance without the additional configuration of Apache or Apache modules. Perhaps another advantage is being less dependent on the server environment and the knowledge of the end user, and controlling more aspects of how the app works inside the app itself.

All of that being said, in most cases Apache will be a better choice, followed by zlib compression. Apache will also compress the JS and CSS. The disadvantage is that not everyone in your community will be running mod_gzip, and some may have no option to enable it (shared hosts). A config checkbox would let advanced users turn PHP compression on or off if they wanted to use other methods like Apache or zlib. Even if you don't use compression with the output buffer, there could be enough of a performance boost to consider using it to compress your template cache files before saving them; data saved that way can be retrieved with unserialize(gzuncompress(file_get_contents($file))).


While this technique is nice to know, it is much easier to enable compression via .htaccess, also because of the reasons Ryan mentions. I don't see any reason to do this via PHP. See the H5BP .htaccess for a nice technique, https://github.com/h5bp/html5-boilerplate/blob/master/.htaccess#L379, and other goodies.

Also, while the HTML served might be 4 kB instead of 24 kB, that alone doesn't make the page load that much faster; but using this for all resources, along with caching headers, gets you roughly a 30% decrease in transferred file sizes.

Using the PW template cache can have the biggest impact on page loading: depending on how render-intensive the page is, you get loading times around 2 to 5 times faster, simply because the page doesn't need to be rendered.


I do agree: if you have Apache and the technical know-how to pull this off, then using Apache is the best method, because it will compress your CSS, JS and other files (though you can compress JS/CSS using PHP as well). The only problem with this approach is that we are assuming Apache is the only web server in existence and that users have the ability to enable and configure the Apache modules required to pull this off.

Also, in many cases these types of configurations should go into the vhost.conf for performance reasons: the vhost.conf is read once when Apache restarts, whereas .htaccess is read on each and every page request.

The second part of the conversation is more interesting to me though. Will saving compressed data in the form of cache files be faster than saving uncompressed data? This is where I can see some potential value for using compression with PW.

$compressed_cache = gzencode($data);
// if you need to serialize use gzencode(serialize($data))
// for zlib compression use gzcompress($data) 

Then when you are retrieving the cache file you can uncompress it...

$value = gzdecode(file_get_contents($cache_file));
// if you need to unserialize use unserialize(gzdecode(file_get_contents($cache_file)))
// for zlib compression use gzuncompress(file_get_contents($cache_file))

Or send it compressed if the browser supports this...

if (substr_count($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip')) {
    $value = file_get_contents($cache_file);
} else {
    $value = gzdecode(file_get_contents($cache_file));
}
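One caveat worth adding: if you send the stored gzip bytes as-is, the browser also needs to be told they are compressed, otherwise it will try to render raw gzip data. A small sketch, reusing the $cache_file variable from the snippet above:

<?php
// Sketch: send the cached gzip bytes directly, with the matching headers
if (substr_count($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip')) {
    header('Content-Encoding: gzip');  // gzencode() output matches this encoding
    header('Vary: Accept-Encoding');   // keep proxies from caching the wrong variant
    echo file_get_contents($cache_file);
} else {
    echo gzdecode(file_get_contents($cache_file));
}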
