$image->Width() function timeout


MuchDev
Hey there, so I figured someone would know how to solve this simply. I have seen some posts that get close to what I am working towards, but instead of bumping old solved posts I figured I would open a fresh one. 

I have thousands of images for which I would like to pre-cache the derivative images. Currently I think I am running into memory issues with the image resizer, as the images have to be cached every time I load a page for the first time. A lot of the time I am getting this error:

Error: Maximum execution time of 30 seconds exceeded (line 1393 of /home/swflemin/public_html/dg/wire/core/ImageSizer.php) 
This error message was shown because you are logged in as a Superuser. Error has been logged. Administrator has been notified.

Is there any way to run an update on an image field so that these images get cached ahead of time? Or should I just use a site spider to force it to cache the images? Maybe this would be a good mod to add to an image field, like alternate resolutions, so that the server doesn't have to try to generate them just in time.

if ($gridSizer == "1") {
    $gen .= '<div class="cards col-lg-3 col-md-4 col-sm-6 col-xs-12">';
    $width = 400;
} elseif ($gridSizer == "2") {
    $gen .= '<div class="cards col-lg-6 col-xs-12">';
    $width = 550;
} elseif ($gridSizer == "3") {
    $gen .= '<div class="cards col-lg-9 col-xs-12">';
    $width = 825;
} elseif ($gridSizer == "4") {
    $gen .= '<div class="cards col-xs-12">';
    $width = 1105;
} else {
    $gen .= '<div class="cards col-lg-3 col-md-4 col-sm-6 col-xs-12">';
    $width = 400;
}

$thisItem->artwork_img->width($width)->url;

Here is my logic: most images will be resized from around 1200px down to 400px. There is a selector on each item that lets the user decide the image size.
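For what it's worth, one way to answer my own "cache ahead of time" question would be a one-off bootstrap script that walks the pages and touches every width once. This is only a sketch under assumptions: the field name `artwork_img` and the widths come from my snippet above, and the script is assumed to sit next to ProcessWire's `index.php` and be run from the command line.

```php
<?php
// Hedged sketch: pre-generate derivatives so the first front-end view
// doesn't have to. Assumes this file lives next to ProcessWire's index.php.
include __DIR__ . '/index.php'; // bootstrap ProcessWire

set_time_limit(0); // this may run far longer than the 30s page limit

$widths = array(400, 550, 825, 1105); // the sizes used by the grid code

foreach ($pages->find("artwork_img.count>0") as $p) {
    foreach ($p->artwork_img as $img) {
        foreach ($widths as $w) {
            // width() creates the derivative file on disk if it is missing
            $img->width($w);
        }
    }
    $pages->uncache($p); // keep memory flat across thousands of pages
}
```

Run once (e.g. `php precache.php`), then delete it; subsequent page views should find the derivatives already on disk.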

Hi MuchDev,

by default we prefer to bump older posts rather than create new ones. That way, things that belong close together are also stored close together, which makes it easier for followers (for us all). But no problem here. :)

And, if you want people to have a good chance of helping you with your problem, you should provide as much useful information as possible. :)

(The focus is on useful == quality over quantity.)

Ah, I found a post titled: Lots of images to resize, timeout in frontend

I can ask you mainly the same things here:

"Maximum execution time exceeded" doesn't say much. I think it has more to do with your server than with the ImageSizer, because I know many sites with many thousands of images that don't run into this. Also, normally it doesn't matter how many images you have on a site in total, but how many belong to a single page. I have running sites with single pages of 300+ images that need to be rendered into 4 derivatives each on first page view (1200 derivatives in less than 30 seconds). What are the maximum and/or average number of images per page on your site? How many derivatives do you need?

How big are your original images in MB and in pixels (maybe you have some with too many megabytes lying around)? You can set a maximum width and height in the image field settings to downsize them directly on upload to, for example, 800px or whatever you prefer.

Does the server have enough memory to work with? How much? Recommended is 128M, or better 256M, for large image sites. Are you on a shared host? Is PHP running as an Apache module or as CGI?
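You can answer most of those questions from PHP itself; a minimal check, using only standard functions:

```php
<?php
// Print the PHP settings most relevant to image resizing timeouts.
echo "memory_limit:       " . ini_get('memory_limit') . "\n";
echo "max_execution_time: " . ini_get('max_execution_time') . "\n";
// e.g. "apache2handler" (module), "cgi-fcgi" (FastCGI), "fpm-fcgi" (PHP-FPM)
echo "SAPI:               " . php_sapi_name() . "\n";
```

Put it in a temporary template or run it from the CLI (note the CLI reports its own SAPI and limits, which can differ from the web server's).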

If you read that other thread you will find a solution to render your derivatives directly on upload.
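The resize-on-upload approach from that thread boils down to a hook. This is only a sketch: the field name `artwork_img` and the widths are taken from the code earlier in this thread, and you would place it in an autoload module or `/site/templates/admin.php`.

```php
<?php
// Hedged sketch: create derivatives as soon as an image is uploaded in the
// admin, so first front-end page views don't pay the resize cost.
// Assumes the image field is named artwork_img.
$wire->addHookAfter('InputfieldFile::fileAdded', function($event) {
    $image = $event->argumentsByName('pagefile');
    if (!$image instanceof Pageimage) return;            // only act on images
    if ($image->field->name != 'artwork_img') return;    // only this field
    foreach (array(400, 550, 825, 1105) as $w) {
        $image->width($w); // generates and caches the derivative on disk
    }
});
```

The trade-off mentioned below still applies: every uploaded image immediately costs four derivatives' worth of disk space, whether or not they are ever viewed.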

:)


Thank you for your post, this is really helpful information. I did see that post before but missed what you guys were talking about when I read it initially. Good to know about the etiquette; I always feel like I'm doing it wrong. :)

Sorry for the limited info; I supplied all I could see out of that very vague error the server gave me. I was actually surprised how vague the error was; I'm used to seeing errors that are considerably more detailed and not in red :). After staring at the error and reading your previous posts on the other page, it is now a bit more obvious that my problem is a timeout: the server hits its 30-second limit while processing a page for the first time. My images are limited by the current web guy to 1200px wide proportionately, therefore the largest image we have so far is around 900 KB - 1 MB. Most pages have a 30-item limit in the query and are then paginated.

The server itself is a VPS through Bluehost, so the specs are:

Dual-Core Opteron 6376 (2.2 GHz)

512k cache

Memory: 1901912k/2097140k available (5330k kernel code, 408k absent, 194820k reserved, 7009k data, 1280k init)

CentOS 6.5

As far as the deeper settings go, I must first admit this is my first foray into Linux server administration, but I did some digging and my PHP handler is set to FCGI. Everything came pre-installed and configured when I set the server up initially. Is this something I could view through WHM?
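A quick way to confirm the handler without digging through WHM is a throwaway template file that dumps the PHP configuration (delete it afterwards, since phpinfo() exposes server details):

```php
<?php
// Temporary template file: view the page once, check the "Server API" row
// (apache2handler = Apache module, cgi-fcgi = FastCGI), then delete this.
phpinfo();
```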

That is a really cool tweak for the image module; I hadn't understood that the function was forcing the creation of derivative images. But as this is paid hosting, I feel wary of increasing the image footprint and forcing the customer to pay for extra storage; I foresee this site ending up with around 30-40,000 items/images plus derivatives.

All in all, it looks like I may need you to help me to help you so you can help me :D.


Only 30 images of max 1 MB or max 1200px isn't much. I have 300 images at 800px per page on one site that works fast and fine.

The only thing that rings an alarm here is the FCGI mode. I don't have enough experience with server administration (in fact, I don't have any :) ), but I have seen that servers configured with FCGI can have disadvantages that result in slow execution. (It has something to do with system threads needing to wait for their turn to start; they are somehow queued and delayed.)

So, I believe Apache with FCGI could be tweaked to also run fast, but I don't know how. By contrast, I have never seen those problems when PHP runs as an Apache module.


This is interesting; I would really like to see if there are some tweaks I could apply to my setup to get things processing faster. I just went digging again through cPanel and WHM, and there are a multitude of settings panels I could adjust with the proper documentation. According to some forum posts, it looks as if my install is already a tweaked version of CentOS, so I will need to approach any modifications with this knowledge in hand. Ah, the joys of learning, it never gets old :).


Alright, I'm going to try running CGI as the handler for a while and see how well it performs. If there are no issues, I suppose I could just use this. I found some more info on the forum on the subject, and I see now that quite a few people have had issues with their hosting and FCGI.

https://processwire.com/talk/topic/4524-fresh-install-mod-rewrite-and-apache-as-fastcgi/

