Search the Community

Showing results for tags 'slow'.

Found 8 results

  1. Hello fellow PW devs! This is a short story from the server management trenches. I spent the past couple of days trying to solve an unexpected problem: after DigitalOcean patched the droplets in the NYC3 region last week, my client's droplet became almost useless and went down a couple of times. The droplet has 2GB RAM and was running Ubuntu 16.04, which was updated to kernel 4.4.0-116 after the patch. The server was provisioned using Forge (forge.laravel.com).

     After SSHing into it and running "top", I noticed the cause: "php-fpm7.1" processes (3-5 instances) were spiking the CPU to 100%. This was very odd, as the CPU usually stayed around 33% most of the time. The site uses ProCache and MarkupCache and was getting around 800-1000 visits/day last week. I checked everything on PW's side and nothing seemed out of place, so I restarted PHP and Nginx, but the problem continued. I checked the access logs and no suspicious activity showed up. I upgraded PHP to 7.2 to see if anything would change, but the problem continued. My only guess after all that is that the droplet in question got screwed up somehow, because I didn't see any complaints on the web from other people having the same problem on DO (but I confess I only did a quick Google search).

     So in the end I decided to create a new droplet, now with 2 CPU cores, and kept the 2GB (1 extra core and $5 cheaper). I reinstalled PW there and pointed the floating IP to the new server. The installation went smoothly but for one issue: the error log started showing MySQL "too many open files" errors when users were searching. I had never encountered this message before, so after reading some StackOverflow posts, I changed the mysql.service config file to remove its file limit (https://stackoverflow.com/a/36807137; a sketch of that kind of change follows after this list). Everything is normal now, but I think I'll never discover what truly happened. Has anyone else had this kind of problem with MySQL before?
  2. I'm building a site and I noticed that loading time is very slow when we enter the site address for the first time. It takes about a minute or more just to start loading files and displaying the content. However, after that everything is very fast: once the page(s) are loaded, I can refresh them quickly and without any delay. Has anyone faced a problem like this, and what can I do to resolve it?

     More details: ProcessWire 3.0.80, hosting type: VPS. This message appears in the admin dashboard: "Warning: your server locale is undefined and may cause issues. Please add this to /site/config.php file (adjust “en_US.UTF-8” as needed): setlocale(LC_ALL,'en_US.UTF-8');" (a config.php sketch follows after this list). The site: http://almanassah.net
  3. I'm displaying a list of products which are found by their templates, but the pages are taking a very long time to load. At first I blamed it on my image rendering (using PIM2), but even with all those images now stored in the file tree, the page is taking abysmally long to load. ProCache seems to help, but I don't feel as though what I'm trying to do should be gnawing on the bones of my resources quite so long.

     The variable for the selector is defined in my header include:

         $productCatList = "prod_series|prod_series_ethernet|prod_series_access|prod_series_accessories|prod_series_fiber|prod_series_pwr_supplies|prod_series_pwr_systems|prod_series_wireless";
         $getCurrentProdOptions = "template=$productCatList, prod_status_pages!=1554|1559|1560|4242";

     Then in the template for the page upon which the directory loads:

         $products = $pages->find("$getCurrentProdOptions");
         include_once("./prod-list-row.inc");
         echo $out;

     And the prod-list-row.inc foreach (which is on every page that's exhibiting the slowdown):

         <?php
         $sum = 0;
         $out = "";
         $out .= "<div class='span_12_of_12'>\n";
         foreach($products as $p) {
             $sum += 1;
             if($sum % 2 == 0) {
                 $bgcolor = '#fff';
             } else {
                 $bgcolor = '#e4e4e4';
             }
             $par = $p->parent;
             $out .= "<div class='section group' style='background: $bgcolor; min-height: 110px'>\n";
             $img = $p->prod_image;
             $thumb = $img->pim2Load('squarethumb100')->canvas(100,100,array(0,0,0,0),'north',0)->pimSave()->url;
             $out .= "<div data-match-height='{$p->title}' class='col span_2_of_12 hide'>";
             $out .= "<a href='{$p->url}'><span class='product-image-box'><img src='{$thumb}' alt='{$p->title}' title='{$p->title}'></span></a>";
             $out .= "</div>";
             $out .= "<div data-match-height='{$p->title}' class='col span_6_of_12'>";
             $out .= "<div class='prod-list-name-label'><a href='{$p->url}'>{$p->title}</a></div>";
             if($page != $par) {
                 $out .= "<div class='prod-list-category-label' style='font-size: .7em;'>Category: <a href='{$par->url}'>{$par->title}</a></div>";
             }
             $out .= "<div class='list-headline' style='font-size: .8em;'>{$p->headline}</div>";
             $out .= "<div class='learn-more-buttons-sm'>";
             $out .= "<a href='{$p->url}' title='Product Specs and Documentation'><span class='find-out-more-button' style='font-size: .8em;'><i style='font-size: .8em;' class='fa fa-lightbulb-o'></i> &nbsp; Learn More</span></a>";
             $out .= "</div>";
             $out .= "</div>\n";
             $out .= "<div data-match-height='{$p->title}' class='col span_4_of_12'>";
             if(count($p->prod_feat_imgs) > 0) {
                 $out .= "<div class='featured-icons-list' style='margin: 2em .5em;'>";
                 foreach($p->prod_feat_imgs as $feat) {
                     $icon = $pages->get("$feat->prod_featicon_pages");
                     if($icon->image) {
                         if($feat->prod_feat_textlang) {
                             $icontitle = $feat->prod_feat_textlang;
                         } else {
                             $icontitle = $icon->title;
                         }
                         $out .= "<img src='" . $icon->image->size(35,35,$imgOptions)->url . "' alt='" . $icontitle . "' title='" . $icontitle . "' class='listing-feat-icon' style='margin-right: .5em;' />";
                     }
                 }
                 $out .= "</div>";
                 if($p->prod_product_line) {
                     foreach($p->prod_product_line as $pline) {
                         if($pline->image) {
                             $out .= "<div style='height: 35px;'>\n";
                             $out .= "<img src='{$pline->image->size(75,35,$imgOptions)->url}' alt='{$pline->title}' />";
                             $out .= "</div>";
                         }
                     }
                 }
             }
             $out .= "</div>";
             $out .= "</div>";
         }
         $out .= "</div>";

     Is there a clear culprit here of what I'm doing that's so stressing the system? I turned off TracyDebugger because I saw another thread about that causing slowdown (even though I'm using the latest version), but that had no effect. Every time I thought I had found the culprit and commented it out, nothing changed.

     Would appreciate some more eyes on this (there's also a hedged sketch for this one after the list). Thank you!

     ETA: prod_feat_imgs is a repeater field which contains a Page reference field (from which I pull the image and title) and a multi-language text field (to override the page reference title if it exists). Could that be the problem?
  4. I have a website with a slow page load, mainly due to a slow query on a listing. I think there might be a better way to query or arrange the data on the back end, which is what's causing the query to be slow. The data is organized like this: Area -> Level -> Path; the path then links to a pool of units included within that path. The units then have study locations listed as child pages, with contact information etc. I have a page where I list all the study locations, but because I'm going through every unit and then every child study-location page, it takes quite a while. I have over 200 units with around 5 locations as sub-pages each. Is there any way I can reorganise the data to make this listing faster to load? (A sketch of a flatter query follows after this list.)
  5. Hey there, I have a collection (parent page) of persons (child pages of 'collection'). Each person has several fields, two of them repeater fields where the person can enter their 'jobs' and 'residencies'. I'm trying to build a list of entries which looks like:

     Actor
       Peter
       Maria
       Paul
       …
     Doctor
       Eva
       Julia
       William
       …

     For the first 5 persons everything worked smoothly, but now that I've reached about 20 entries the server slows down and I'm wondering if my loop is somehow cluttered up.

         <?php $langname = $user->language->title; // get current user language ?>
         <?php $profAll = array(); // collect all profession entries here ?>
         <?php $persons = $pages->get('/collection')->children->filter("lang=$langname"); // get all children for the current user language ?>

         <section class="profession">
           <h1>professions</h1>
           <?php foreach ($persons as $child): ?>
             <?php foreach ($child->professions as $profession): // the repeater field is called "professions", the field itself "profession" ?>
               <?php
                 $profAll[] = $profession->profession; // store all entries
                 $profUnique = array_unique($profAll); // keep only unique entries
                 sort($profUnique);                    // sort the entries
               ?>
             <?php endforeach; ?>
           <?php endforeach; ?>

           <ul style="column-count: 2;">
             <?php foreach ($profUnique as $profLetter): // loop through all professions ?>
               <li style="font-size: 2rem; list-style-type: none;" class="letter"><?= $profLetter /* output one profession, e.g. Actor */ ?></li>
               <?php foreach ($persons->find("professions.profession=$profLetter")->sort('givenname') as $person): // find all persons who have this profession ?>
                 <li><a class="ajax" href="<?= $person->url ?>"><?= $person->givenname /* output the name of the person who fits the profession */ ?></a></li>
               <?php endforeach; ?>
             <?php endforeach; ?>
           </ul>
         </section>

     Is there a way to make this request faster? I'll have at least two of them on the same page. (A sketch of an alternative loop follows after this list.)
  6. Hello, I've searched high and low, and this has come to a point where there's not much more I can do. I'm simply unable to view the site now. So far: 1. index.html - a plain HTML file is fast, so it's not an Apache problem. 2. text.php - a simple PHP file showing phpinfo() is fast, so it's not a PHP problem. The site has been becoming slower and slower over time. I installed Xdebug and I get the results you can see in the attached images. I'm concerned because this is a problem that keeps getting worse and worse, to the point where it is impossible to keep working on the website. Any ideas of what I might do to improve this situation? It's getting really bad on the site and also in the backend. Thanks...
  7. Hello, getting random pages with

         $oldPosts = wire('pages')->find('template=post, shown=1, limit=100')->findRandom(16);

     works fairly well (about 500 ms), but without the limit=100 it takes more than 6 seconds. There are only 60 pages with the post template, but there will be thousands when the app is live. Is there a more efficient way of getting random pages? (A sketch using sort=random follows after this list.)
  8. Often I use internal domain names (in the hosts file) with the extension .local. Sites developed with that extension respond as they should in Safari (Mac). In Google Chrome, however, the response is very slow. It looks like Google wants to collect as much data as possible, and before the request is sent to Apache the data is sent to mighty Google. Changing the extension from .local to .loc (in the hosts file) solves the lagging for me. Hopefully this post is helpful for people experiencing the same issue.
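
Sketch for result 1 (the MySQL "too many open files" fix): on Ubuntu 16.04 MySQL runs under systemd, so the usual way to lift the open-file limit is a systemd drop-in for mysql.service rather than my.cnf alone. This is only a minimal sketch of that kind of change, not the poster's exact edit, and the limit value is an assumption.

    # /etc/systemd/system/mysql.service.d/override.conf
    # (can be created with: sudo systemctl edit mysql)
    [Service]
    # Raise the per-process open-files limit for mysqld;
    # use a high number or "infinity" as needed.
    LimitNOFILE=65535

After saving the drop-in, reload systemd and restart MySQL (sudo systemctl daemon-reload && sudo systemctl restart mysql).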
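
Sketch for result 2 (the locale warning): the admin warning already names the fix; it goes into /site/config.php, with the locale string adjusted to one that is actually installed on the server.

    <?php
    // /site/config.php
    // Define the server locale so PHP string functions behave predictably.
    setlocale(LC_ALL, 'en_US.UTF-8');

This addresses the dashboard warning only; it may or may not be related to the slow first load described in the post.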
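
Sketch for result 3 (the product listing): one plausible cost is the extra $pages->get() call per repeater item, on top of a pim2 thumbnail generated per product on every request. Below is a minimal sketch of memoizing the icon pages so each distinct icon is loaded only once per request; the field names come from the post, while the $iconCache helper is an assumption, not the poster's code.

    <?php
    // Load each referenced icon page only once per request.
    $iconCache = array();

    foreach($p->prod_feat_imgs as $feat) {
        $iconId = (string) $feat->prod_featicon_pages; // ID of the referenced icon page
        if(!isset($iconCache[$iconId])) {
            $iconCache[$iconId] = $pages->get($iconId); // single lookup per distinct icon
        }
        $icon = $iconCache[$iconId];
        if($icon->image) {
            // ...build the <img> markup exactly as in the original include...
        }
    }

Adding a limit (with pagination) to the main find(), or caching the finished markup, would be other directions to try, but profiling should come first.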
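
Sketch for result 4 (listing all study locations): instead of walking Area -> Level -> Path -> unit -> location in nested loops, ProcessWire can usually fetch every location page in a single find() on its template, then read the owning unit from each location's parent. The template name below (study_location) is a placeholder, since the post doesn't name it.

    <?php
    // One query for all ~1000 study locations instead of a loop per unit.
    // "study_location" is a placeholder template name for illustration.
    $locations = $pages->find("template=study_location, sort=title");

    foreach($locations as $loc) {
        $unit = $loc->parent; // the unit this location belongs to
        echo "<li>{$unit->title}: {$loc->title}</li>";
    }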
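
Sketch for result 5 (professions list): the inner $persons->find() runs one in-memory search per profession, so the work grows with the number of professions times the number of persons. Building a profession => persons map in a single pass avoids that; the field names are taken from the post, and the grouping approach is a suggestion, not the poster's code.

    <?php
    // Group persons by profession in one pass instead of one find() per profession.
    $byProfession = array();
    foreach($persons as $person) {
        foreach($person->professions as $item) { // "professions" repeater
            $name = $item->profession;           // "profession" field inside it
            if($name) $byProfession[$name][] = $person;
        }
    }
    ksort($byProfession); // professions in alphabetical order

    foreach($byProfession as $profession => $people) {
        echo "<li class='letter'>$profession</li>";
        usort($people, function($a, $b) {        // persons by given name
            return strcmp($a->givenname, $b->givenname);
        });
        foreach($people as $person) {
            echo "<li><a class='ajax' href='{$person->url}'>{$person->givenname}</a></li>";
        }
    }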
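
Sketch for result 7 (random pages): ProcessWire selectors accept sort=random, which lets the database pick the random rows directly instead of loading a larger set and calling findRandom() on it. It is worth benchmarking once there are thousands of posts, since random ordering has its own cost in MySQL, but it avoids the unbounded find:

    <?php
    // Ask for 16 random matching posts directly in the selector.
    $oldPosts = wire('pages')->find('template=post, shown=1, sort=random, limit=16');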