DrQuincy Posted August 29

I have something like this:

$pages = wire('pages')->find('template=foo'); // $pages->count() is 330

foreach ($pages as $page) {
    // Here I get 23 properties
    $foo1  = $page->property1;
    $foo2  = $page->property2;
    // ...
    $foo22 = $page->property22;
    $foo23 = $page->property23;
}

If I only get 1 property it takes about 0.02s, but if I get all 23 it goes up to ~5s on a decent Linux server. The execution time increases proportionately with the number of properties retrieved, so there isn't any one property that is a bottleneck.

Is this to be expected? 330 pages is not exactly a lot.

I know the API has a $cache variable, but the above pages are linked to a search form so that could prove tricky (although I could cache the initial form where no filters are applied). Is there anything I can do to increase the speed? I guess I could store all properties as JSON in an additional field and just retrieve that. It just seems unreasonable to deal with that kind of execution time on a few hundred records.

Thanks.

EDIT: Just seen that the Autojoin option might do the trick.
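A quick way to check whether the slowdown really scales with the number of field loads, rather than one expensive field, is to time each property pass on its own. A minimal sketch, with placeholder field names standing in for the 23 real properties:

$fields = ['property1', 'property2', /* ... */ 'property23']; // placeholder names

$pages = wire('pages')->find('template=foo');

foreach ($fields as $name) {
    $start = microtime(true);
    foreach ($pages as $page) {
        $value = $page->get($name); // each non-autojoined field is lazy-loaded on first access
    }
    printf("%s: %.4fs\n", $name, microtime(true) - $start);
}

If every field shows a similar per-pass cost, the overhead is the one-query-per-field lazy loading, which is exactly what Autojoin avoids by loading the field values in the same query as the pages.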
DrQuincy (Author) Posted August 29

I have tried Autojoin and it is approx. 5x faster, so that's good.
BrendonKoz Posted August 29

Hard to know exactly what to suggest, as many different scenarios could affect responsiveness. (Re-?)Rendering images could easily slow things down, as could page reference properties that aren't loaded until called/necessary. Depending on your setup, using findRaw() or findMany() might have a positive impact, as would your autojoin. If you think autojoin might only be necessary for this one scenario, you could look into findJoin() and whether it accomplishes the same end result on that page. For a ProcessWire (Pro) module assist, there's always ProfilerPro as well.
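For reference, a minimal sketch of the two per-query approaches mentioned above. The field names are placeholders, and both calls assume a reasonably recent ProcessWire 3.x core:

// findJoin() loads the named fields in the same query that loads the pages
$joined = wire('pages')->findJoin('template=foo', ['property1', 'property2']);
foreach ($joined as $page) {
    echo $page->property1;
}

// findRaw() skips Page objects entirely and returns plain arrays keyed by page ID
$rows = wire('pages')->findRaw('template=foo', ['title', 'property1', 'property2']);
foreach ($rows as $id => $row) {
    echo $row['property1'];
}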
da² Posted August 29 (edited)

When I need to load a fair amount of data I do custom MySQL queries. findRaw() is very fast too, but I don't like to parse its results. Just be careful to sanitize data where needed (at least all text fields). The downside is that it takes much more development time, but I'm getting better at MySQL stuff. 😁
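As an illustration of that approach, a sketch of a direct query through ProcessWire's $database (a PDO wrapper). It assumes a hypothetical field named property1, which ProcessWire stores in a field_property1 table with pages_id and data columns; adjust the table and column names to your actual fields, and entity-encode text output as noted above:

$database  = wire('database');
$sanitizer = wire('sanitizer');

// Join the pages table to the template and to one field table (placeholder name)
$sql = "SELECT p.id, f.data AS property1
        FROM pages AS p
        JOIN templates AS t ON t.id = p.templates_id
        LEFT JOIN field_property1 AS f ON f.pages_id = p.id
        WHERE t.name = :template";

$query = $database->prepare($sql);
$query->execute([':template' => 'foo']);

foreach ($query->fetchAll(\PDO::FETCH_ASSOC) as $row) {
    // entity-encode text values before output, as suggested above
    echo $sanitizer->entities($row['property1']) . "<br>";
}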