
horst

PW-Moderators
  • Posts

    3,964
  • Joined

  • Last visited

  • Days Won

    85

horst last won the day on September 16 2021

horst had the most liked content!

About horst

Contact Methods

  • Website URL
    https://nogajski.de/

Profile Information

  • Gender
    Male
  • Location
    Aachen, Germany
  • Interests
    Photography; PHP, HTML, CSS, SASS;

Recent Profile Visitors

35,461 profile views

horst's Achievements

Hero Member (6/6)

5.7k Reputation

Community Answers (53)

  1. FIELDS first! TEMPLATES second! CONTENT last! Even if you want to add two new templates at once with a family relation like parent <> child: you paste the JSON on the import side, and it tells you about the conflict after inspecting it. The solution is simply to run the import twice. That is already supported by the importer.
  2. You definitely should! 🙂 Have a look at the screenshots, and it's 100% safe to:
  3. Not tested, but what about the PHP function trim()? I would trim <p> and </p> from your field content (a sketch follows after this list). (Sorry, on mobile.)
  4. https://www.google.com/search?q=site%3Aprocesswire.com%2Ftalk+RockSEO doesn't show much, but Rock(XYZ) indicates that it is a module from @bernhard (Bernhard Baumrock) 🙂 EDIT: Oh, @Guy Incognito has beaten me to it! 🙂
  5. +1 +1 Could be a good first (or next) step towards automated file-based migrations with a rollback feature! 🙂
  6. Which PW version is this site running on? Or which version was it running on 10.01.2022?
  7. @donatas Hi, can you share your code for line 442?
  8. How do you process the input? Is $sanitizer in use? If so, which sanitizer method? You can debug the input directly before any processing with the raw $_POST or $_GET superglobals of PHP, to see whether more than the 246 chars arrive (see the sketch after this list). (And then go further step by step.)
  9. ATM I simply use the manual export <> import function from PW for fields and template settings. Every time I have to alter field and/or template settings, I write down the timestamp on a pad of paper when I start, and until deployment I collect a list of all fields I touch (on the left side) and of all template names (on the right side).*
     When I'm ready to deploy, I may or may not shut down the live system for a short time, depending on the amount of changes. Then I select all needed fields in the DEV's export selector, copy the export JSON, drop it into the target's fields import field and proceed. The next step is the same for template changes. There is no need to shut down a live site for a long time. For my largest deployments it was closed for one hour at most, but most of the time I do these DB migrations without any maintenance mode, because most of my sites use ProCache. If it really mattered, I could prime the cache by crawling the site once before starting to deploy.
     It is not comfortable, and also not the best I can think of, but it's not nearly as uncomfortable as described by @bernhard 😉 Using these functions is not nearly as error prone as manually writing EVERYTHING into one schema or another for every field and template change. Noting everything by hand is not tolerant of typos, and I might forget to note a value, a change to a template, etc. All of this is avoided by simply copy-&-pasting the ready-generated JSON directly from PW. No typos, nothing forgotten, nothing added. Not perfect, but for me much faster, easier and safer than writing everything down again by hand.
     PS: * In case you are not sure whether you have written everything down on paper, you can generously include additional fields or templates in the export. It is better to include more (or even all of them) than too few: all unchanged settings will be recognized by PW during import and will not be processed further. 🙂
  10. Hi @Didjee, unfortunately you are right. My old code no longer works. 😞 I have dived into the issue and filed a GitHub issue with a possible fix. So hopefully Ryan will soon find time to look at it. 🙂 If you like, as a workaround you can place my new suggestion into the file wire/core/Pageimage.php instead of the current webp() function there.

      /**
       * Get WebP "extra" version of this Pageimage
       *
       * @return PagefileExtra
       * @since 3.0.132
       *
       */
      public function webp() {
          $webp = $this->extras('webp');
          if(!$webp) {
              $webp = new PagefileExtra($this, 'webp');
              $webp->setArray($this->wire('config')->webpOptions);
              $this->extras('webp', $webp);
              $webp->addHookAfter('create', $this, 'hookWebpCreate');
          }
          // recognize changes of the JPEG / PNG variation via timestamp comparison
          if($webp && is_readable($webp->filename()) && filemtime($webp->filename()) < filemtime($this->filename)) {
              $webp->create();
          }
          return $webp;
      }
  11. Hey @benbyf, I believe I'm not of much help. 😉 Mine seems to be just another dinosaur setup. 🤣 IDE: NuSphere PhpED. PW modules always: Admin DEV Mode Colors | ALIF - Admin Links In Frontend | Cronjob Database Backup | Process Database Backups | Login Notifier | User Activity | ProCache. I develop on local https hosts (via Laragon on Windows). Where possible, I use staging on subdomains of the client's LIVE server, as this is mostly the exact same environment setup as on LIVE. Sync & migration between all installations is done with Beyond Compare and the PW Database Backup module, and/or the remote-files-on-demand hook from Ryan. Only very rarely do I work directly on STAGE via PhpED (SSH, SFTP) and afterwards pull it down to local. That's it. 🙂
  12. @DrQuincy what is this page doing that takes a lot of time? If there is a loop that iterates over a lot of items, and you are able to implement an optional GET param carrying the last processed item identifier, there is a good chance that this is all you need to make it work. For this, check whether a start identifier was given in the URL; if not, start with the first item, otherwise with item #n from $input->get->myidentifiername. When starting the script, store a timestamp, and in each loop iteration compare it with the current timestamp. Once it reaches 100 seconds or more, do a redirect call to the same page, but with the identifier of the last processed loop item (see the fuller sketch after this list):

      ...
      if(time() >= $starttimestamp + 100) {
          $session->redirect($page->url . '?itemidentifiername=' . $lastProcessedItemID);
      }
      ...

      Or, much easier: have you tested whether the 120 second time limit can be reset from within the script? (Calling set_time_limit(120) from within a loop can make it run endlessly on some server configurations.)
  13. @Didjee I think this must be an issue with the core, but I'm not able to investigate further ATM. Maybe there is a possible workaround, or at least some (micro) optimization, for your code example: leave out the subfile class(es) from the chain. If you have enabled $config->webpOptions['useSrcExt'], you can globally enable the creation of webP variations in the base core image engines, directly when any jpeg or png variation is created, by setting $config->imageSizerOptions['webpAdd'] to true. But this can result in a lot of unwanted webP files from intermediate image variations (which are never shown on the front end and are only steps in the building chain)! Therefore it is better to leave this setting globally at false and instead pass an individual options array with it enabled into any image sizer method where appropriate (width(123, $options), height(123, $options), size(123, 456, $options)). Example:

      <?php
      // this creates a jpeg AND a webp variation with 900px for the 2x parts
      $imageX2 = $page->image->getCrop('default')->width(900, ['webpAdd' => true]);

      // this reduces the 2x variation to half the size for 1x (jpeg AND webp at once)
      $imageX1 = $imageX2->width(450, ['webpAdd' => true]);

      // now, as you have set $config->webpOptions['useSrcExt'] to true,
      // every webp variation simply gets a trailing ".webp" appended to the jpeg URL:
      ?>
      <source type="image/webp" srcset="<?= $imageX1->url . '.webp' ?> 1x, <?= $imageX2->url . '.webp' ?> 2x">
      <source type="image/jpeg" srcset="<?= $imageX1->url ?> 1x, <?= $imageX2->url; ?> 2x">

      Given that the issue you encountered is located in the files / images sub module for webp, the auto refresh should now be working, as you are using the older (but clumsy) method in the base imagesizer engine. I have not tested this for quite a few (master) versions, but hopefully it is still true. 🙂 PS: Warm greetings to the neighbouring city! 👋
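
A minimal sketch of the trim() idea from answer 3, assuming a ProcessWire template context. Note that PHP's trim() removes characters from a mask rather than whole substrings, so the tags are stripped explicitly and trim() only handles the surrounding whitespace; the field name "body" is a placeholder, not part of the original post.

    <?php
    // Sketch for answer 3: strip a wrapping <p>...</p> from a field value.
    // "body" is a placeholder field name.
    function stripWrappingParagraph(string $value): string {
        $value = trim($value);                            // surrounding whitespace
        if(stripos($value, '<p>') === 0) {
            $value = substr($value, 3);                   // leading <p>
        }
        if(strtolower(substr($value, -4)) === '</p>') {
            $value = substr($value, 0, -4);               // trailing </p>
        }
        return trim($value);
    }

    echo stripWrappingParagraph($page->body);             // "<p>Hello</p>" becomes "Hello"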
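
For answer 8, a hedged sketch of comparing the raw input with the sanitized value; the field name "myfield", the log name "debug-input", and the use of $sanitizer->text() with its maxLength option are assumptions for illustration only.

    <?php
    // Sketch for answer 8: inspect the raw input before any sanitizing.
    // "myfield" and the log name "debug-input" are placeholders.
    $raw = isset($_POST['myfield']) ? $_POST['myfield'] : '';

    // what really arrives, before $sanitizer touches it
    $log->save('debug-input', 'raw length: ' . mb_strlen($raw));

    // then compare step by step with the sanitized value
    $clean = $sanitizer->text($raw, ['maxLength' => 1024]);
    $log->save('debug-input', 'sanitized length: ' . mb_strlen($clean));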
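
And for answer 12, a fuller sketch of the chunk-and-redirect pattern under the mentioned 120 second limit; the GET parameter "itemindex" and the helpers getItems() / processItem() are hypothetical stand-ins for the real work.

    <?php
    // Sketch for answer 12: process a long list in ~100 second chunks and
    // redirect back to the same page with the index of the next item.
    // getItems() and processItem() are hypothetical helpers, not PW API calls.
    $start  = time();
    $offset = (int) $input->get('itemindex');   // 0 on the first request
    $items  = getItems();                       // however you collect your items

    for($i = $offset; $i < count($items); $i++) {
        processItem($items[$i]);                // the real per-item work

        // close to the 120 second limit? hand the rest over to a fresh request
        if(time() - $start >= 100) {
            $session->redirect($page->url . '?itemindex=' . ($i + 1));
        }
    }

    echo 'Done: processed ' . count($items) . ' items.';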