Everything posted by Autofahrn

  1. It just manually forces the string to be enclosed in quotes, but single quotes this time. Since I guess the double quotes get removed somewhere, you may try this variant as well:
     ['title|name|indexer', '%=', '"'.$a_terms.'"' ]
     In both cases you really need to ensure the search term does not contain quotes itself; searching for "it's" will fail now. That's normally something $sanitizer->selectorValue() takes care of (see the sketch below).
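     A minimal sketch of the safer variant (untested; $a_terms and the field names are taken from the lines above):
     // Let the sanitizer handle quoting/escaping instead of doing it by hand.
     $term = $sanitizer->selectorValue($a_terms);
     $matches = $pages->find([
         ['title|name|indexer', '%=', $term],
     ]);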
  2. Thanks for the pointer. Searching for "&something" does not work with AjaxSearch either, since it sends the field unescaped through a GET request, so "something" effectively becomes a second parameter and the search string arrives empty (it no longer contains the ampersand). But that's another story. Could you try this line (or, better, run $a_terms through $sanitizer->selectorValue())?
     ['title|name|indexer', '%=', "'$a_terms'" ]
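     To illustrate the GET splitting (a generic PHP sketch, not AjaxSearch's actual code):
     // A raw query of "&something" leaves q empty and adds a bogus parameter...
     parse_str('q=&something', $params);
     var_dump($params); // ['q' => '', 'something' => '']
     // ...while URL-encoding the term keeps it in one parameter:
     parse_str('q=' . urlencode('&something'), $params);
     var_dump($params); // ['q' => '&something']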
  3. Did you check for bad characters? I've had this sometimes with code copied and pasted from the forum. I modified the script a little to hopefully display a more reasonable error message.
  4. Nice, I always forget... Selector Upgrades
  5. Me neither. Could you set $config->debug = true; in your site/config.php? Then there should be some more information in that error message.
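     For reference, the standard debug switch in /site/config.php:
     // /site/config.php
     $config->debug = true; // show detailed error messages (turn off again on production)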
  6. Seems like your $sanitizer->selectorValue() does not work at all for some reason. Could you check with Tracy?
     $search_term = '!';
     $val = "template=contentpage,title|headline|body%={$sanitizer->selectorValue($search_term)}";
     d($val);
     It should output:
     "template=contentpage,title|headline|body%="!"" (45)
  7. The search on that site can actually do both, depending on whether the search phrase starts with a ". No, it does not make a difference (just tried). I just logged my resulting $selector and saw the ! is encapsulated in double quotes:
     text_index%="!", limit=16
     Which version of PW are you running (I'm on 3.0.123)? Maybe there's a fix in $sanitizer->selectorValue(). (Edit: Ok, I should read the last paragraph as well.)
  8. @BillH, I'd probably go with a quite different approach and maintain a separate "RecentList" whenever entries are updated, instead of searching on each view. If images are stored on individual pages, you simply maintain a repeater (on some hidden page) holding a page reference to each image. Whenever a new image is uploaded (i.e. via a hook on page save), a link is added to that repeater; if it then holds too many items, the oldest ones are removed in the same step. Since you are using multiple images per article, your repeater has to hold a reference to the article plus the basename of the newly added image. On page view you simply load the repeater content (no search at all). To speed things up, you may add more fields to that repeater, like the title of the article, so there is no need for additional database queries. To ensure titles stay in sync, your page save hook needs to update the cached article titles as well. A rough hook sketch follows below.
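     A rough sketch of such a hook for /site/ready.php (untested; the names recent-list, recent_items, article_ref and the article template are made up for illustration, and detecting which image was just added is left out):
     $wire->addHookAfter('Pages::saved', function(HookEvent $event) {
         $article = $event->arguments(0);
         if($article->template != 'article') return;              // only react to article saves
         $list = $event->wire('pages')->get('name=recent-list');  // hidden page holding the repeater
         if(!$list->id) return;
         $item = $list->recent_items->getNew();                   // new repeater item
         $item->article_ref = $article;                           // page reference to the article
         $item->title = $article->title;                          // cache the title to avoid extra queries
         $item->save();
         $list->recent_items->add($item);
         while($list->recent_items->count() > 20) {               // trim the list to the newest 20 entries
             $list->recent_items->remove($list->recent_items->first());
         }
         $list->save('recent_items');
     });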
  9. Does the error log tell you more about the 500 error? Just wondering, since I basically do the same in the pwflex.team-tofahrn.de search:
     if($config->ajax && (($query = $input->get->q) != '')) {
         $query = $sanitizer->selectorValue($query);
         $search = "text_index~=$query";
         $selector = "{$search}, limit=".($Limit+1);
         $matches = $pages->find($selector);
         // ... $matches are rendered here ...
     }
     The only difference may be that my script requires three characters before it sends the query (makes no difference, just tried).
  10. @arjen, I quickly skimmed the log and did not find anything wrong. In that log the close happens at 0.5s and the open at 2.7s, so everything seems to work fine, and the time between close and open indicates the time the zip operations take. You may give it a try and place the backups in some other location (e.g. next to the .../public folder or in .../public/backup), but I doubt this helps, since the logfiles are there. I can only guess that the $zip->close() fails for some reason, so please try changing that block to read:
     $zipLog->verbose("CLOSING ZIP: {$zipfile}");
     $res = $zip->close();
     if($res !== true) {
         $zipStatus = $zip->getStatusString();
         $zipLog->verbose("CLOSE ZIP FAILED: {$zipStatus} in {$zipfile}");
         throw new WireException("Unable to close ZIP ({$zipfile}): {$zipStatus}");
     }
     $zipLog->verbose("OPENING ZIP: {$zipfile}");
     Since I expected the close to fail, I read the complete log and saw that it tries to zip temporary files generated by ProCache. So it's probably worth adding an exclude for ProCache:
     %/ProCache-[^/]+$% # Ignore ProCache directory
     and maybe the pwpc directory as well:
     /site/assets/pwpc/
     Edit: I forgot that exclusion directories must be site-relative (so it's /site/assets/pwpc/, not /pwpc/).
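     Just to show what that regex exclude would match (a generic PHP check, not Duplicator's actual filter code):
     $pattern = '%/ProCache-[^/]+$%';
     var_dump(preg_match($pattern, '/site/assets/ProCache-abc123')); // int(1) -> excluded
     var_dump(preg_match($pattern, '/site/assets/files/1234'));      // int(0) -> kept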
  11. Ok, too early... At least we managed to make this a hot topic.
  12. For me, count() always returns the number of pages found, regardless of its argument. But I'm still on 3.0.123...
  13. That's clear. But to fulfill the initial request, $products should hold the full result of the find in case it contains at least three elements. So PageArray::count(int) would have to be something conditional, returning the full array if it contains the requested number of items and something empty otherwise.
  14. Is it mentioned somewhere that PageArray::count resp. WireArray::count may return something other than an int? Always learning...
  15. Try:
     $products = $pages->find("template=product, limit=9, sort=sort");
     if($products->count() > 3) {
     }
     https://processwire.com/api/ref/wire-array/count/
  16. I'd probably do the find and then check whether $products contains the minimum amount. Otherwise you'll have two database queries.
  17. Do you use ZipArchive to build the zipfile? The method addFile may take more than a single argument: https://www.php.net/manual/en/ziparchive.addfile.php The first is the full path to the real, uncompressed file on disk; the (optional) second parameter is the name inside the zipfile. I'd simply specify something unique for the second argument then (see the example below). Edit: Ok, just realized you're referring to wireZipFile, so probably not helpful.
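     For illustration, plain PHP ZipArchive with explicit entry names (the paths are made up):
     $zip = new ZipArchive();
     $zip->open('/tmp/archive.zip', ZipArchive::CREATE);
     // The same source file can be added twice under distinct entry names:
     $zip->addFile('/path/to/image.jpg', 'article-1/image.jpg');
     $zip->addFile('/path/to/image.jpg', 'article-2/image.jpg');
     $zip->close();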
  18. I've just re-read the previous posts and see that CLOSING ZIP actually happens after 0.0 seconds of operation. This is pretty unlikely since I expect some time for directory enumeration and packaging the first part. Can you send me that verbose log via PM?
  19. The if(false && ... is my PHP replacement for #if 0, e.g.:
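     // roughly equivalent to wrapping the block in #if 0 ... #endif in C:
     if(false && $condition) {
         // this block stays in the source but is never executed
     }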
  20. ok, "no such file". No surprise, that re-opening and the package build fail. So I wonder why creation of the zip does not already throw an exception. Do you see that file on your server?
  21. Well, I guess your site contains a lot of data, so the re-opening still gets triggered. I'm really interested in the returned error number; did you try to add the suggested code change? Otherwise you may simply disable the whole block in duplicator.module:
     if(false && ($fragmentBytes >= self::DUP_ZIP_FLUSH_MBYTES*1024*1024)) {
         $zipLog->verbose("CLOSING ZIP: {$zipfile}");
         $zip->close();
         $zipLog->verbose("OPENING ZIP: {$zipfile}");
         // if($zip->open($zipfile) !== true) throw new WireException("Unable to re-open ZIP: $zipfile");
         $res = $zip->open($zipfile);
         if($res !== true) throw new WireException("Unable to re-open ZIP ({$zipfile}): {$res}");
         $zipLog->verbose("OPENED ZIP: {$zipfile}");
         $fragmentBytes = 0;
         set_time_limit(300);
     }
     @flydev, maybe we should make DUP_ZIP_FLUSH_MBYTES configurable, with that option disabled when not set.
  22. So it's the same issue @arjen observed, quite strange. @entschleunigung, did you verify the backups are complete? Do they have a reasonable size?
  23. I don't think so. The fix only retrieves the timestamp from the filename differently; it does not look at the path. Edit: just saw that there normally should be a filename between "failed," and "doesn't". Did you specify something in the name field?
  24. Good point, I probably should edit the first post to contain the download links... Currently they are hiding in the second post from April 12th.
  25. The second install is probably working on a smaller site (less than 200 MBytes of data), so the re-opening mechanism is not triggered at all. I guess you edited the pathnames, since the /**/ looks weird to me and I'm missing the domain name. Does your domain name contain a dash? If so, this fix should help: