Site with terrible server-side performance


benbyf


Hi, I've got a site with mysterious up-and-down performance: sometimes it takes a couple of seconds to return the page and sometimes a minute! Just wondered if there are strategies for working out where the work is being done / what the processes are that are causing it?


This is a bit vague, but Tracy can sometimes help identify slow requests. You could also look into browser dev tools (just in case it's some specific resource that's slowing queries down), slow database queries (MySQL slow query log if it's available, though PW debug mode tools can also provide plenty of insight into these if you happen to run into a slow query while browsing the site), and perhaps even Apache logs to identify which specific requests are taking a long time.
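Where Tracy is available, its timer helper makes it easy to bracket a suspect code path and log anything slow. A minimal sketch of that idea; the function being timed and the log name are made up for illustration:

```php
<?php namespace ProcessWire;

// Sketch: timing a suspect code path with Tracy's named timer
// (available when the TracyDebugger module is installed).
// renderExpensiveSidebar() is a hypothetical stand-in.
use Tracy\Debugger;

Debugger::timer('sidebar');            // start a named timer
$out = renderExpensiveSidebar();       // the code under suspicion
$elapsed = Debugger::timer('sidebar'); // stop, returns seconds elapsed

// Log anything unusually slow so patterns show up over time
if ($elapsed > 1.0) {
    wire('log')->save('perf', "Sidebar took {$elapsed}s");
}
```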

If you have access to ProDevTools package, I can highly recommend Profiler Pro. We recently used this to debug a module that was causing issues due to hooks, and for that purpose it was brilliant.

On the other hand: if said slowdown is indeed quite random, one potential culprit would be LazyCron. I would likely start by checking if any local code snippet or module is triggering a slow task via LazyCron. I've run into that many, many times.
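For context, LazyCron doesn't run in the background: it piggybacks on a regular page request, so whichever visitor happens to trigger the hook waits for the whole task to finish. That produces exactly the "randomly slow request" pattern described above. A minimal sketch of the pattern, with a hypothetical maintenance function:

```php
<?php namespace ProcessWire;

// Sketch: a LazyCron hook runs *inside* a normal page request, so the
// triggering request is blocked until the task completes.
// doHeavyMaintenance() is a hypothetical stand-in.
wire()->addHook('LazyCron::everyHour', function(HookEvent $e) {
    // Everything in here delays the page that triggered the hook
    doHeavyMaintenance();
});
```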


Thanks @teppo! Yeah, in dev tools it's the request itself that's slow each time (but not slow every time). So I will definitely look at my LazyCron hooks, and maybe at Profiler Pro; if it can look at specific module processes and queries, that would be awesome!


Think I've worked it out: every hour I run a cron job to check all pages to see if they need to be published (I have a "publish later" thing going on), which wasn't an issue years ago, but now there are like 3k pages.


59 minutes ago, benbyf said:

Think I've worked it out: every hour I run a cron job to check all pages to see if they need to be published (I have a "publish later" thing going on), which wasn't an issue years ago, but now there are like 3k pages.

Seems feasible.

Depending on how it's done, just switching $pages->find() to $pages->findMany() can help a lot (assuming it's a relatively new PW). Recently worked on a site with ~700k pages, and making this change to SchedulePages helped speed things up considerably.
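A rough sketch of what such a scheduled-publish check might look like with findMany(); the field name publish_from and the exact selector are assumptions for illustration, not the actual code discussed here:

```php
<?php namespace ProcessWire;

// Sketch of an hourly "publish later" check. findMany() (PW 3.0.46+)
// loads pages in chunks rather than all at once, so memory stays flat
// even with thousands of matches. publish_from is a hypothetical
// datetime field holding the scheduled publish time.
wire()->addHook('LazyCron::everyHour', function(HookEvent $e) {
    $due = wire('pages')->findMany("publish_from<=now, status=unpublished, include=all");
    foreach ($due as $p) {
        $p->of(false);                               // turn off output formatting before save
        $p->removeStatus(Page::statusUnpublished);   // publish the page
        $p->save();
    }
});
```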


If it is indeed the thing slowing things down, I think my issue was that I was using

$pages->find("{myField}=1, include=hidden")

which I've now changed to

$pages->find("{myField}=1, status=hidden")

I'm hoping this helps. I've also turned the frequency of the cron job down... Although there may be thousands of pages, there should only really be a couple of hidden ones, so I need to find those as fast as possible.


From the documentation: "…but findRaw() is more useful for cases where you want to retrieve specific things without having to load the entire page (or its data)". So it stands to reason that findRaw() might be faster in this case, since it doesn't have to construct full Page objects at all.
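As a hedged sketch of that idea, using the field name from the posts above: findRaw() (PW 3.0.172+) can return just the matching IDs straight from the database, and the few full Page objects can then be loaded individually:

```php
<?php namespace ProcessWire;

// Sketch: findRaw() returns plain values from the database instead of
// Page objects. myField is the field name from the posts above.

// Just the IDs of hidden pages with myField checked:
$ids = wire('pages')->findRaw("myField=1, status=hidden", 'id');

// Load full Page objects only for the few that actually matched:
foreach ($ids as $id) {
    $page = wire('pages')->get((int) $id);
    // ...publish/unhide logic here...
}
```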


On 10/8/2021 at 1:12 PM, benbyf said:
$pages->find("{myField}=1, status=hidden")

I'm hoping this helps. I've also turned the frequency of the cron job down... Although there may be thousands of pages, there should only really be a couple of hidden ones, so I need to find those as fast as possible.

So are you looking for hidden pages or unpublished pages? Because for pages that have never been published, this may be faster:

$pages->find('published=, include=all');

The selector status=unpublished will turn into SQL as where pages.status & 2048. Now I don’t know what optimizations MySQL can do there, but I suppose it will still have to look at a bunch of values and see if they match, whereas published=, include=all turns into pages.published IS NULL, so it should be a matter of returning a continuous range of rows from that index. Even better if you only want the ID, then it should never even touch the actual table.


16 minutes ago, Jan Romero said:

So are you looking for hidden pages or unpublished pages? Because for pages that have never been published, this may be faster:

$pages->find('published=, include=all');

The selector status=unpublished will turn into SQL as where pages.status & 2048. Now I don’t know what optimizations MySQL can do there, but I suppose it will still have to look at a bunch of values and see if they match, whereas published=, include=all turns into pages.published IS NULL, so it should be a matter of returning a continuous range of rows from that index. Even better if you only want the ID, then it should never even touch the actual table.

I'm looking for published pages that are hidden, but I would love to hear if there is a faster query than $pages->find("{myField}=1, status=hidden") to return those pages!


Try adding status>=hidden to your selector. I haven’t tested this with many pages, nor with an additional field selector, but it gives much better index utilization than the query without it: 17 examined rows vs. 2000 (this is on MariaDB 10.5.11).

Clearly only values of 1024 (hidden) or greater can contain the 1024 bit, so the result set should be the same.
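An untested sketch of how that suggestion combines with the earlier selector (myField stands for the field from the posts above):

```php
<?php namespace ProcessWire;

// Sketch: status=hidden tests the 1024 bit (pages.status & 1024),
// which MySQL must evaluate row by row. Adding status>=hidden adds a
// range condition (pages.status >= 1024) that an index on pages.status
// can satisfy, drastically shrinking the rows examined. Any status
// containing the 1024 bit is necessarily >= 1024, so both selectors
// match the same pages.
$hits = wire('pages')->find("myField=1, status=hidden, status>=hidden");
```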

