BFD Calendar Posted October 14, 2015
For a few days now, without any system changes, updates or anything else, ProcessWire has been running very slowly on my localhost:8888; it takes almost a minute for any page to load, including admin pages. I'm using MAMP Pro on Mac OS X 10.10. Other (HTML) sites run fine, and I'm only working on one PW site. Does anybody have any idea where to look?
FrancisChung Posted October 14, 2015
Have you checked your Apache, MySQL and PHP logs?
cstevensjr Posted October 14, 2015
Please look at the following post for possible help: https://processwire.com/talk/topic/8805-slow-admin-localhost/
BFD Calendar Posted October 14, 2015
@cstevensjr, yes, I checked that post and searched the forums, but found nothing really helpful for a Mac localhost. @FrancisChung, I will do that tomorrow and post again if there's anything remarkable to be found. Thanks.
BFD Calendar Posted October 15, 2015
@FrancisChung, I checked, but I don't see anything special or different in the last two weeks, which is when the slowness started. Plain HTML pages load fast, so I suppose either MySQL or PHP is the culprit.
cstevensjr Posted October 15, 2015
It would be very helpful to the troubleshooting effort if we knew which versions you are running (PW, PHP, MySQL, etc.). Any other specific information about your installation would also be beneficial. Thanks.
BFD Calendar Posted October 15, 2015
PW 2.6.18 - PHP 5.6.7 (cache module off) - Apache on port 8888 - MySQL 5.5.42 on port 8890 - MAMP Pro 3.2.1 - MacBook Pro, OS X 10.10. The Apache signature reads: Apache/2.2.29 (Unix) mod_wsgi/3.4 Python/2.7.8 PHP/5.6.7 mod_ssl/2.2.29 OpenSSL/0.9.8zg DAV/2 mod_fastcgi/2.4.6 mod_perl/2.0.8 Perl/v5.20.0 configured
cstevensjr Posted October 15, 2015 (edited)
Please review this Stack Overflow thread: http://stackoverflow.com/questions/18537887/mamp-localhost-resolving-very-slowly
Also, please look at this ProcessWire forum thread: https://processwire.com/talk/topic/8805-slow-admin-localhost/
Edited October 15, 2015 by cstevensjr: added ProcessWire thread on slowness
FrancisChung Posted October 15, 2015
Have you got Xdebug turned on? If so, try turning it off.
BFD Calendar Posted October 15, 2015
Well, I narrowed everything down to ProcessWire itself and found out what was happening. I had been importing a lot of pages from .csv files that needed to be available in all languages. For that, somebody here advised me to put a 'set all languages active' script in a ready.php file. That worked fine, but it turns out it was making ProcessWire more sluggish by the day. Commenting out everything in the ready.php file made ProcessWire jump again like a young horse.
Pete Posted October 15, 2015
What was the code in your ready.php file? It would help to narrow down the issue further, as technically anything could go in ready.php.
SiNNuT Posted October 15, 2015
If you have some variation of https://processwire-recipes.com/recipes/activate-all-languages/ running in ready.php, I could certainly see how that would slow things down as the content grows. It's also not meant to be used that way, but maybe I've misunderstood.
BFD Calendar Posted October 16, 2015
That was indeed what I had:

    // SET ALL LANGUAGES ACTIVE SCRIPT
    $pages->setOutputFormatting(false);
    $pag = $pages->find("template='students'");
    foreach($pag as $p) {
        foreach($languages as $lang) {
            if($lang->isDefault()) continue;
            $p->set("status$lang", 1);
            $p->save();
        }
    }
Craig Posted October 16, 2015
Was that code being loaded on every page request?
SiNNuT Posted October 16, 2015
"Was that code being loaded on every page request?"
Probably, otherwise there would be no reason for the code to slow things down. It should be noted that this code should only be run after you have created a bunch of multi-language pages from the API.
BFD Calendar Posted October 16, 2015
Or after you have imported a lot of pages from a .csv file. I imported over 500 'students', in several batches, that needed to be available in two languages. When importing from .csv, pages are only published in the default language, which is why I used the code. I keep the ready.php file with the code commented out so I can run it again after importing more pages from .csv. Apparently the file can be used for other purposes as well, but I'll leave that to the experienced users here.
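A minimal sketch of how a one-off script like this could stay in ready.php without running on every request, guarded behind a superuser check and a GET parameter. The parameter name activate_languages is an illustrative assumption, not something from the thread:

    // ready.php - run the activation only when a superuser explicitly asks for it,
    // e.g. by visiting any page with ?activate_languages=1 appended to the URL
    if($user->isSuperuser() && $input->get('activate_languages')) {
        foreach($pages->find("template=students") as $p) {
            $p->of(false); // turn off output formatting before saving
            foreach($languages as $lang) {
                if($lang->isDefault()) continue; // the default language is already active
                $p->set("status$lang", 1);
            }
            $p->save();
        }
    }

Saving each page once, after all languages have been set, also avoids the one-save-per-language overhead of the original loop.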
Pete Posted October 18, 2015
I've changed the best answer because SiNNuT is right: scripts that alter things after a big import routine should only be run once, not repeatedly. Also, it might be worth having another field called something like "languages_done", so if you import new pages more than once you can store a 1 in that field, skip over the pages that are already done and only process the new ones; otherwise you're changing and saving the existing pages too.
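A rough sketch of that approach, assuming a checkbox or integer field named languages_done has been added to the students template (the field name and the =0 selector are assumptions based on Pete's description, not code from the thread):

    // only load students that have not been processed yet
    foreach($pages->find("template=students, languages_done=0") as $p) {
        $p->of(false);
        foreach($languages as $lang) {
            if($lang->isDefault()) continue;
            $p->set("status$lang", 1);
        }
        $p->languages_done = 1; // mark the page so it is skipped on the next run
        $p->save();
    }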
BFD Calendar Posted October 18, 2015
Wouldn't it be better, in my case, to have the script search for pages where template=students and the languages are not yet active, and then set them to active? Otherwise I have to add the 'languages_done' field and store a 1 in all the pages that are already there. And if it doesn't find any, it probably wouldn't slow anything down either, except of course the time spent running a useless script.
Pete Posted October 18, 2015
Your current code isn't massively scalable, as you're iterating over every single page regardless of whether it has been processed before. The additional check you just mentioned: yes, you could (and probably should) do that with a longer selector, but I'm not sure what it's actually doing (is there a field for every language? I'm not sure what status$lang is doing). The extra field with the 1 in it, saved after a page has been processed, means the loop skips loading pages that were already handled the next time you run it; I suggested it purely as a quick example. The extra field wouldn't add much overhead, as it's only used during this process (fields aren't pulled from the database when viewing a page unless you use them somewhere in your templates), but yes, there are other ways to do this too. If the script is only run after you import large numbers of pages, you could also just jot down the last page ID and put ", id>14950" (for example) at the end of your selector. Lots of options.
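And a minimal sketch of the id-based variant, where 14950 stands in for whatever the highest existing page ID was before the latest import; the loop stays the same as above, only the selector changes:

    // skip everything that already existed before the last import
    $pag = $pages->find("template=students, id>14950");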
BFD Calendar Posted October 19, 2015
From a few tests I did, the ", id>14950" option was definitely the best. As soon as the script has to search through over 2000 pages it runs into several '30 seconds' errors. It's also definitely better to break the .csv imports down into batches of about 200 and run the script before it chokes.
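For reference, those '30 seconds' errors are PHP hitting its default max_execution_time. A sketch of two ways such a script could work around that, assuming a longer run is acceptable on a local install; the 600-second value and the languages_done field (carried over from the earlier sketch) are illustrative assumptions:

    set_time_limit(600); // raise the execution limit for this run only
    // or process a bounded chunk per run, roughly matching the 200-page .csv batches
    $pag = $pages->find("template=students, languages_done=0, limit=200");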