Lots of images to resize -> timeout in frontend...


titanium

Dear forum members,

I implemented a very basic picture gallery on a client's website. All images are uploaded to an image field, then resized and output like this:

foreach ($page->images as $image) {
	$output .= '<a href="' . $image->width(1000)->url . '" title="' . $image->description . '"><img src="' . $image->size(100, 100)->url . '" height="100" width="100"></a>';
}

I think that's common practice so far, and it works nicely with a limited number of images.

Since the client is uploading a lot of images, the page loading time in the frontend exceeds the webserver's processing timeout limit of 30 seconds, and the browser then displays a script timeout error.

Is there anything I can do to avoid this? Raising the script timeout is not an option because it doesn't scale. I think the best way would be to resize the images to the two formats (100x100 pixels and 1000 x proportional height pixels) right at upload. I guess a module for this could be handy. Has anyone done that already, or could someone give me a hint in the right direction?

Thanks a lot!


Hi titanium

You should try the following steps:

  1. Make sure the images are resized after uploading. This doesn't need to be the exact size of the thumbnail, just a reasonable resolution for the web, e.g. 1280x1024 max., depending on your needs. This option is in the image field's settings.
  2. Maybe increase PHP's memory_limit for faster image processing (see the sketch after this list).
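
A minimal sketch for the second point, assuming your host allows changing the limit at runtime (the 256M value is only an example):

// give PHP more headroom for resizing large images;
// could also be raised in php.ini instead
ini_set('memory_limit', '256M');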

@adrian

The problem is that when the client uploads monster images (high resolution, big file size), it takes time to create the thumbnails. If this exceeds the 30 seconds set by max_execution_time in php.ini, you'll get that error.


This module should take care of what you need:

http://mods.pw/1b

Thanks adrian, I already tried it. The Thumbnails module does the right thing when it comes to images with a fixed width AND height. That's true for the thumbnails (100 x 100 pixels), but for the detail view I would like to use a proportional height - this is currently not possible with the Thumbnails module.


One way is to tell the client to reduce the image size. I think we shouldn't try to solve everything; we should put a minimum of responsibility on those who edit the website. Just point them to an easy tool to install on their computer: http://www.addictivetips.com/windows-tips/image-resizer-powertoy-clone-for-windows-7vista/


@Wanze,

I wasn't thinking about the first time the front-end page was viewed. I guess it's the view of a page with lots of new thumbnails at once that is causing the timeout problem.

@Titanium

I thought I mentioned the idea of making a proportional option for the thumbnails module somewhere, but can't seem to find the post. Seems to me like it would be very useful and not hard to implement.

@diogo

I agree, but experience tells me that people are lazy and also can't follow instructions :)


You may also try calling set_time_limit(30) in your foreach loop.

foreach ($page->images as $image) {
	set_time_limit(30);
	$output .= '<a href="' . $image->width(1000)->url . '" title="' . $image->description . '"> ... </a>';
}

If your site is not running in safe_mode and your hosting service hasn't blacklisted this function, every image gets a maximum of 30 seconds to be processed.

BUT BETTER: have the client resize the images before upload, as Diogo said!


This is the only way I know to create different sizes when uploading images:

<?php
class ImageCreateThumbs extends WireData implements Module {

    public static function getModuleInfo() {
        return array(
            'title' => 'ImageCreateThumbs',
            'version' => 100,
            'summary' => '',
            'href' => '',
            'singular' => true,
            'autoload' => true
            );
    }

    public function init() {
        // run after a file has been added to a file/image inputfield
        $this->addHookAfter('InputfieldFile::fileAdded', $this, 'sizeImage');
    }

    public function sizeImage($event) {
        $inputfield = $event->object;
        if($inputfield->name != 'images') return; // we assume a field named "images"
        $image = $event->argumentsByName("pagefile");
        // create the variations right away, so the frontend finds them cached
        $image->size(120,120);
        $image->size(1000,0);
    }
}

https://gist.github.com/5685631

What I don't know is whether it really makes a difference (I guess it does), and whether drag-and-drop AJAX upload and old-school upload both process the images one by one.

My suggestion is to also have the images uploaded already at 1000x* pixels, because if they upload 3000+ px images it will just take a lot longer.


Does it work, meaning is there no timeout? I just removed a line that was not needed.

Yes, it definitely works. There is no timeout; the page loads instantly. I tested it with 52 images totalling 292 MB (the images are huge), and it works as desired.

One thing I don't understand: I thought it would be clever to use the ID of the field instead of the name:

if ($field->id != 146) {
  return;
}

I assumed it would be better to use the id, because I tend to rename fields sometimes, and using the ID saves me from hunting down the changed name everywhere :-)

But that doesn't work (146 is the id of my "images" field for sure) - if I do this, the images aren't resized anymore. I don't even know how to debug the module. I tried

var_dump($field);
break;

and

var_dump($field);
exit;

but had no luck.


A field's name is almost like an id and shouldn't be changed. Renaming might happen when you start out, but once you have set up your fields they are most likely also referenced by name in template files or other settings, and changing the name will break your code or output.

In a perfect world this module would perhaps have a configuration option where you could define the field by reference.

Using the id here, you'd have to get the field first. I changed the variable $field to $inputfield in the code, to avoid confusion: it's actually the inputfield and not the field (as the hook already suggests). Inputfields have no database id; they mostly serve as the input that interacts with the user. If you read the id of the inputfield, you'll get "InputfieldImage_fieldname", which is used for the markup when it's rendered. It's the field that has the id property - the id you see when editing a field and in the DB.

Debugging the module is a little special, but it can easily be done with a simple trick. Add this to the hook function:

echo $inputfield->id;
exit();

or 

die("id:" . $inputfield->id);

Now if you upload an image, the script will stop there and you should see what it outputs. If you're using AJAX upload, this appears in the AJAX request response if you look in the developer tools. It will show "InputfieldImage_yourfieldname".

Now, to get the actual field, you fetch it via the "name" of the field using the $fields API:

$fieldID = $this->fields->get($inputfield->name)->id; 
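
Put together, a minimal sketch of the id-based check inside the hook (assuming soma's module from above; 146 stands in for the field id mentioned earlier):

public function sizeImage($event) {
    $inputfield = $event->object;
    // resolve the actual Field object via the inputfield's name,
    // then compare its database id (146 is the example id from the post above)
    $field = $this->fields->get($inputfield->name);
    if(!$field || $field->id != 146) return;
    $image = $event->argumentsByName("pagefile");
    $image->size(120, 120);
    $image->size(1000, 0);
}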

  • 4 months later...

Sorry to revive this old topic:

I was wondering if anybody knows what set_time_limit() actually does when put in the image loop?

If you add e.g. set_time_limit(8) inside the image loop, what happens if those 8 seconds are not "enough"? I suppose the script will stop, and when the page is reloaded it would continue from where it stopped? So adding the time limit wouldn't have any adverse effects on site performance once the script has run through on the initial load, right?

Cheers

Phil


@phil_s: set_time_limit($n) sets the maximum number of seconds the script may run before it gets interrupted!

If you set it to zero, set_time_limit(0), the script may run forever if your server isn't set up to prevent this.

But you also don't want a script to run forever, because you lose control over it.

Therefore it is a good choice to call it with a small number of seconds inside a loop, because each call resets the maximum time to live.

For example, if the default PHP timeout is set to 30 seconds and you have 50 images to process, where each image takes 2 seconds, your script will crash after the 15th image.

BUT, if you call set_time_limit(15) within the foreach loop, all 50 images get processed and the script will run for 100 seconds - if everything is OK.

If the script runs into problems with a single image and hangs, it gets interrupted 15 seconds after it started processing the damaged image.

Conclusion: with this approach you can exceed the default time a script may run, without losing control. :)


Going back to the original post, I think there was some confusion in the posts that followed as to what the problem actually was.

My understanding was that the images had uploaded fine, but the timeout occurred when displaying the actual gallery page. This is because the first time an image is displayed on a page at a new size specified by the template file, ProcessWire creates that size image and saves it to disk. The penalty is that the first person viewing the page gets the slow loading time while all the images are resized on the fly; in this case, though, the gallery held a lot of images, and trying to create all the new sizes in a single request caused the timeout.
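
For illustration, this is roughly what happens with ProcessWire's stock API (the variation filename is an example of the pattern used):

// first call: resizes the original and saves a variation file to disk,
// e.g. site/assets/files/1234/photo.100x100.jpg
$thumb = $image->size(100, 100);

// later requests with the same dimensions just reuse the cached file
echo "<img src='{$thumb->url}' width='100' height='100'>";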

The obvious benefit is that the system only creates these resized images as required, and once they're created, that's it - they don't need to be created again unless someone changes the size in the template. Also, it's most likely - as in this case - that the first person to view a page and see the slow loading time is the person who just created the page and uploaded the images, so it's not something other visitors would generally encounter.

It does sound like a bit of a bottleneck with large numbers of images in a gallery, though. One solution is to set the image field's maximum dimensions (I think that was suggested above) so that images are resized on upload if they're over certain dimensions. For most stuff on the web I set this to 1600x1200. Sure, less tech-savvy users will still try to upload photos straight from cameras at resolutions like 5000x2600 and endure slow uploads, but these will then be resized immediately by the field's max dimensions, which should help that first page load where it has to create whatever image size is set in the template.

I'm wondering if a solution to the original problem might be for the upload routine to scan the template for any $image->size(x,y) calls and create those sizes as the images upload? Since each upload using the AJAX uploader is a separate request to the server, it shouldn't technically time out then.


 Going back to the original post, I think there was some confusion in the posts that followed as to what the problem actually was.

??

My understanding was that the images had uploaded fine, but the timeout occurred when displaying the actual gallery page. This is because the first time an image is displayed on a page at a new size specified by the template file, ProcessWire creates that size image and saves it to disk.

Yes, if the first call in a page (template) displays all images at a smaller size than the originals, all the smaller images must be created first. And if you have e.g. 50 images and every resize takes 1 second, you will run into a timeout after 30 seconds by default.

If you want to avoid that, you can use set_time_limit($n) in the loop as I explained in my previous post. That's one approach to not run into a script timeout. Once the images are created, the complete script will run within a second or less. And there is no drawback to using set_time_limit($n)!

Soma has shown another approach: hook into the AJAX upload and do all the resizes then. This way all images are already created / cached before a page is called for the first time.

But if you later change the dimensions for the output, you have the same situation as described in the first post: all images are uploaded fine, but you have to create 50+ variations on the fly. (I suggest using set_time_limit($n) in the foreach loop :) )

Limiting the max dimensions in the field settings may be a solution in some cases, but not if you e.g. need both 300px and 3000px in a site.

I'm wondering if a solution to the original problem might be for the upload routine to scan the template for any $image->size(x,y) calls and create those sizes as the images upload? Since each upload using the AJAX uploader is a separate request to the server, it shouldn't technically time out then.

I'm very interested in a way to do an automated scan for all needed sizes. :) Here are some lines of code that may appear in templates and request images (to simplify things and focus on the most common usage, I disregard image calls from / to other pages / templates):

$imgs = $page->images;
foreach($imgs as $img) {
    $img = $img->width < $img->height ? $img->height(600) : $img->width(600);
    echo "<img src='$img->url' style='width:33%;height:auto' alt='$img->description' />";
}

// ...

$options = array(
	'upscaling' => false,
	'cropping' => false,
	'sharpening' => 'medium',
	'quality' => 84,
);
foreach($page->images as $image) {
	$thumb = $image->width(200); // note: method calls need {...} braces to interpolate in double-quoted strings
	echo "\t<li><a href='{$image->size(800,800,$options)->url}'><img src='$thumb->url' width='$thumb->width' height='$thumb->height' alt='$image->description' /></a></li>\n";
}

// ...

$imgs = $page->images;
$options = array('quality'=>75,'sharpening'=>'soft','upscaling'=>true,'cropping'=>true);
foreach($imgs as $img) {
    $options = $img->width < $img->height ? array_merge($options,array('cropping'=>'north')) : $options;  // people portraits or landscape ?
    $img = $img->width(600,$options);
    echo "<img src='$img->url' style='width:33%;height:auto' alt='$img->description' />";
}

And once again - even if you have successfully scanned and collected all wanted sizes, this only works if you never change the dimensions in a template after the first images are uploaded to the site. Also, I think this is very close to what soma already posted and what the OP is using with success; he only has to enter the sizes manually into the module.

And I think you can already guess my suggestion if you want to play it safe? Yes: call set_time_limit($n) in the loop! :)


@horst thanks very much, that explains things perfectly!

I recently set up a client site on an entry-level hosting plan, and after moving the site from my dev server to the new account, the frontend would always time out with an error 500.

After some back and forth with support, it turned out it was maxing out the script limit with that initial call.

Now, with set_time_limit in the image loops, everything seems to be working fine so far!

Thank you, again, very useful tip!

cheers

Phil


??

What I meant was that people seemed to be trying to fix a front-end issue with a back-end solution that wouldn't have solved it for the original poster's code :)

Didn't mean your reply, rather earlier ones.


  • 1 year later...

This helped me with an install on Bluehost where I was getting a "Maximum execution time of 30 seconds exceeded" error from an image gallery (all of the images were small, around 800x800 pixels).

I first switched from PHP 5.4 (Blank or default) to PHP 5.4 (FastCGI) and the error went away for up to 10-20 images, but with more than that it came back.

I added set_time_limit to the image loop:

foreach ($page->images as $image) {
    set_time_limit(30);
    // ... image output as before
}

This helped a little more, but I was still getting the message when uploading 30 images at once.

Finally, following this post: https://processwire.com/talk/topic/4280-best-server-configuration-for-processwire/?p=42081 I decided to change the php.ini settings:

memory_limit = 256M

post_max_size = 100M

upload_max_filesize = 100M

max_file_uploads = 100

I didn't bother changing "max_execution_time = 30", since I figured the code above would take care of that - 30 seconds per file instead of for the whole script.

It seems to be working fine now, at least with 30-50 images uploaded at once.


It took me a lot of tweaking to get my server settings right on Bluehost. The problem you may end up with, once you get your execution time sorted, is memory, if you are doing piles of images (like me). One thing I found really helpful was a script I wrote. It's a utility, and the timeouts are maxed out just for doing a really large run when I was getting ready to bring a server live.

This script finds all the images on my site and requests each size; if a derivative doesn't exist yet, the server generates and caches it.

ini_set('max_execution_time', 60*999); // 999 minutes, increase or decrease as needed

$pa = $pages->find("template=items,artwork_img.count>0,id>14000");
$pa = $pa->reverse();
$count = count($pa);

echo "<h1>Processed</h1>";
echo "-------------------------------------------<br/><br/>";

for ($i = 0; $i < $count; $i++) { // start at 0 so the first image isn't skipped
	set_time_limit(300);
	
    $item = $pa->pop();
	$filename = $item->artwork_img->filename;
	//$ImageSizer = new ImageSizer($filename);
    //$ImageSizer->resize(1116,0);

	if(count($item->artwork_img)){
		$item->artwork_img->width(900);	
		$item->artwork_img->width(1116);		
		$item->artwork_img->width(365);	
		$item->artwork_img->width(255);		
	}
	echo $filename."<br/>";

	$pages->uncache($item);
}

echo "<h1>Yet to be Processed</h1>";
echo "-------------------------------------------<br/><br/>";

foreach($pa as $p){
	echo $p->artwork_img->filename.'<br/>';
}

I then decided I would just let the field generate the images on upload, in case a user added a huge pile and didn't give the server a chance to generate the derivatives; when that happened, the server would, of course, hang while processing. For that I just used soma's script, and never looked back. :)

<?php
class ImageCreateThumbs extends WireData implements Module {

    public static function getModuleInfo() {
        return array(
            'title' => 'ImageCreateThumbs',
            'version' => 100,
            'summary' => '',
            'href' => '',
            'singular' => true,
            'autoload' => true
            );
    }

    public function init() {
        $this->addHookAfter('InputfieldFile::fileAdded', $this, 'sizeImage');
    }

    public function sizeImage($event) {
        $inputfield = $event->object;
        if($inputfield->name != 'artwork_img') return; // we assume the artwork_img field
        $image = $event->argumentsByName("pagefile");
		$image->width(900);	
		$image->width(1116);		
		$image->width(365);	
		$image->width(255);			
    }
}
?>

Sorry, didn't mean to ruffle any feathers - I never said I was the one who came up with the code. I just figured that posting a modified version with different image sizes would make the script make more sense alongside soma's original. I would never take credit for soma's genius :)


  • 1 month later...

I also use soma's script and actually tried to enhance it a little. However, I'm stuck on this line:

if($inputfield->name != 'images') return; // we assume images field

What if I have multiple file upload fields for images that should all use this script? Is there no way to do something like...

if(!($inputfield->type instanceof FieldtypeImage)) return; // not working!

...instead, so I don't have to assume a certain field name?
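
Building on the earlier explanation that the inputfield is not the field, one possible approach (an untested sketch) is to resolve the Field via the inputfield's name and test its fieldtype instead of its name:

// look up the actual Field object, then test its fieldtype;
// this should match any image field regardless of its name (untested sketch)
$field = $this->fields->get($inputfield->name);
if(!$field || !($field->type instanceof FieldtypeImage)) return;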

