
Server Side Includes


LostKobrakai

Recommended Posts

I used SSI way back when hosting packages with PHP were too expensive for small sites, but you could get small packages for a euro a month that supported SSI and Perl CGIs. I haven't found a use for them in years though, as they're really limited and, if you're already working in a PHP environment, they tend to break apart code and template logic. So I don't have much to add here, but I'm curious to hear about practical use cases.

Link to comment
Share on other sites

I tried it out 1 or 2 years back on an Apache server, in comparison to PHP. The use case: I was using ProCache there for an archive of 2k+ pages that, once created, would never be changed again. But every page had a section embedded with a register of names as links, which was supposed to grow continuously. I wondered whether SSI could be used to include this section into the cached HTML pages and gave it a try, but it turned out that SSI was much slower than PHP on this server.
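For anyone who wants to try the same thing: a minimal Apache setup for that kind of include looks roughly like this (the file names are just placeholders, and mod_include has to be enabled):

    # .htaccess / vhost – run the cached .html files through the SSI filter
    Options +Includes
    AddOutputFilter INCLUDES .html

    <!-- inside each cached page, where the register should appear -->
    <!--#include virtual="/partials/name-register.html" -->

Note that the INCLUDES filter makes Apache parse every cached response for SSI directives instead of serving it as a plain static file, so the comparison against PHP isn't as one-sided as it might look at first.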

Link to comment
Share on other sites

Varnish ESI with ProcessWire sounds like a fun field to experiment with. It makes a lot of sense if you have high-traffic sites with dynamic (i.e. user-generated or personalized) content that puts quite some load on the server, plus static segments that never change. The only thing you need is SSL offloading, as Varnish doesn't do SSL. Also, in HTTP/2 environments, I'm pretty sure it will cause some funny side effects.
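A minimal sketch of what that could look like (the URLs are placeholders): Varnish gets told to parse HTML responses for ESI tags, and the page embeds the dynamic fragment via an ESI include:

    # default.vcl (Varnish 4+) – parse HTML responses for ESI tags
    sub vcl_backend_response {
        if (beresp.http.Content-Type ~ "text/html") {
            set beresp.do_esi = true;
        }
    }

    <!-- in the otherwise fully cached page template -->
    <esi:include src="/user/statusbar/" />

The /user/statusbar/ fragment then gets its own caching rules (or none at all), while the surrounding page stays cached.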

Link to comment
Share on other sites


There's also a danger of decreasing performance instead of the opposite if you aren't careful with includes, as you may be introducing round trips to the content server or trading a few lines of PHP code for separate request parsing and additional file IO. Even if you speed up nine out of ten requests, the one that has to fetch every dynamic part from scratch may simply take too long in the user's eyes. Thus, Varnish ESI / NGINX SSI usually go hand in hand with in-memory caching, and if you plan for really high traffic, you'll want to make that cache distributed and add some kind of boot-up script to fill (most of) the memory cache in case of a purge. (This is a point where I realized how convenient elastic cloud servers could be in heavy-load environments, as you could fire up a copy of your PW server and run the boot-up script there without slowing down the live instance.)
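Such a boot-up script doesn't need to be fancy; a rough sketch for PW (the path, the template name, the memcache keys and the 5-minute lifetime are pure assumptions) could bootstrap the site and push rendered pages into memcache:

    <?php
    // warmup.php – hypothetical sketch: bootstrap ProcessWire and prime memcache
    include '/path/to/pw/index.php';          // bootstrapping PW provides $wire

    $cache = new Memcached();
    $cache->addServer('127.0.0.1', 11211);

    foreach ($wire->pages->find("template=forum-thread") as $p) {
        $html = $p->render();                        // render the page once
        $cache->set("page:" . $p->id, $html, 300);   // keep it for 5 minutes
    }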

I've been thinking about approaches there for a while as I'm working on my forum project, and it can be quite a headache. There's all kinds of partial, dynamic content like forum statistics, login boxes, thread updates, last-posting lists or PM information, and each has different requirements regarding freshness and load impact. Add different levels of visibility to that (i.e. through forum roles or user settings) and the flow chart becomes one of the feared many-legged spiders ;) Expiry of memory-cached content is another minefield in itself.

I guess one generic rule, for PW as for every other web application, is that the right cache key already gets you halfway to the goal. If you can have your backend (PW) fill the memory cache with dynamic content and either pass the right key to the user in a cookie, use a unique URL, or build the key from GET/POST parameters, the cache can fetch that content easily. Of course, not every piece of content is easily cacheable, and sometimes it may make sense to keep the memory footprint of cached content small by breaking it up, so one doesn't end up trying to cache every single one of a thousand possibilities for every single user. In NGINX this could mean, for example, not caching the complete logout box for each user, but rendering it once into memcache with placeholders and letting NGINX substitute ("echo") the necessary values it has retrieved through earlier calls to the memory cache (and stored into variables with "set").
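In nginx that substitution could look roughly like this (locations, keys and the cookie name are invented for the example) – the SSI include pulls a value from memcache into a variable, and echo drops it into the cached markup:

    # nginx – SSI enabled for the cached pages, /memc/ reads straight from memcache
    location / {
        ssi on;
        # ... deliver the cached HTML here ...
    }
    location /memc/ {
        set $memcached_key "user:$cookie_session";   # key derived from a cookie
        memcached_pass 127.0.0.1:11211;
    }

    <!-- placeholder template stored once in the cache -->
    <!--# include virtual="/memc/user" set="username" -->
    <div class="logout-box">Logged in as <!--# echo var="username" default="guest" --></div>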

That's just a small brain dump and mostly theoretical, but it is an interesting topic and one I'll have to face sooner or later, so if anybody has practical experience, it would be interesting to hear about that.

Link to comment
Share on other sites

As far as I understood Varnish, it dismisses all kinds of cookies by default. Makes sense :)

There's also a danger of decreasing performance instead of the opposite if you aren't careful with includes, as you may be introducing round trips to the content server or trading a few lines of PHP code for separate request parsing and additional file IO. Even if you speed up nine out of ten requests, the one that has to fetch every dynamic part from scratch may simply take too long in the user's eyes.

OK, given that you get 90% of your content from the cache (e.g. Varnish), the backend has WAY more headroom to quickly process and deliver the dynamic results. The additional round-trip delay should add up to somewhat below 4 ms, and given that you need to do some processing and/or DB queries in the background anyway, it's negligible.
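To put rough numbers on that (the request rate and the render time are just ballpark assumptions; the 90% and the 4 ms are from above):

    backend traffic at 100 req/s and a 90% hit rate:   0.1 × 100 = 10 req/s
    extra ESI round trip (cache and backend on the same host/LAN):   ≈ 1–4 ms
    typical PHP + DB time to render a dynamic fragment:   tens of ms

So the added hop is small compared to the rendering work it saves, and the backend only ever sees a tenth of the traffic.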

I think whether or not you want to use SSI / ESI depends a lot on your caching strategy, which in turn depends a lot on the architecture. I see the benefits and the excitement, but you should always ask yourself if the extra effort is worth it. Rule of thumb: the more visitors you have, the more you want a cache and the more efficient the cache becomes. If you have one visitor every 5 minutes (approx. 200/day), you have to use long cache lifetimes, and still every second user experiences a small delay as the cache needs to be revalidated and updated – only for the entry to go stale again after 5 minutes without a single visitor having seen it. Depending on your hardware, 5000/day shouldn't cause any problems – IF they come well distributed and not all of them within 10 minutes. Even that shouldn't break anything, given that you know how to set up your web server and DB server for this kind of traffic, but it will probably get very slow. If you have this kind of scenario, a cache would still make sense and save your a** for those very 10 minutes ;)
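For the "cache expires right before the next visitor" case, Varnish's grace mode helps a lot: with a short TTL plus a longer grace window, stale content is served immediately while the object is refreshed in the background (the values here are arbitrary):

    sub vcl_backend_response {
        set beresp.ttl   = 5m;   # considered fresh for 5 minutes
        set beresp.grace = 1h;   # may still be delivered stale for another hour
    }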

Link to comment
Share on other sites

I have a page that gets hits from around 25,000 users on a single day, peaking at a specific time, and my question was mostly in regard to the CSRF topic, see here: https://www.fastly.com/blog/caching-uncacheable-csrf-security – ESI is one of the options besides using a cookie or even an additional AJAX request.
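The ESI variant from that article boils down to caching the page with a token placeholder and serving only the token fragment uncached – roughly like this (the fragment URL is made up, and I'm quoting the PW CSRF call from memory):

    <!-- cached form, token pulled in fresh on every request -->
    <form method="post" action="./">
        <esi:include src="/esi/csrf-input/" />
        ...
    </form>

    # VCL – never cache the token fragment itself
    sub vcl_recv {
        if (req.url ~ "^/esi/csrf-input/") {
            return (pass);
        }
    }

    <?php // template file behind /esi/csrf-input/
    echo $session->CSRF->renderInput();   // outputs the hidden token input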

Link to comment
Share on other sites
