Everything posted by adrian

  1. Ok, so I decided to play around with writing a DOMDocument based approach which hooks into Page::render and replaces all:

     <img src="/site/assets/files/1234/image.png" alt="my image" class="my_image_classes">

     with:

     <picture class="my_image_classes">
         <source srcset="/site/assets/files/1234/image.png.webp" type="image/webp">
         <img src="/site/assets/files/1234/image.png" alt="my image" class="my_image_classes">
     </picture>

     Because it happens on page render, this approach works with images embedded into RTE fields, as well as those added via regular <img> tags in template files. It automatically copies classes from the original <img> tag to the <picture> tag. I still don't know whether this is the best approach or not, but so far so good!
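For anyone curious, the transformation itself can be done with plain DOMDocument. This is a minimal standalone sketch of the idea, not the module's actual code (it assumes the webp variant lives at the original image path plus ".webp"; in ProcessWire this would run on the markup returned by a Page::render hook):

```php
<?php
// Standalone sketch: wrap each <img> in a <picture> with a webp <source>.
// Assumption: the webp file lives at the original image path plus ".webp".
function wrapImagesInPicture(string $html): string {
    $dom = new DOMDocument();
    libxml_use_internal_errors(true); // silence warnings about HTML5 tags
    $dom->loadHTML($html, LIBXML_HTML_NOIMPLIED | LIBXML_HTML_NODEFDTD);
    libxml_clear_errors();

    // getElementsByTagName() is a live list, so copy it before mutating the DOM
    foreach (iterator_to_array($dom->getElementsByTagName('img')) as $img) {
        if ($img->parentNode->nodeName === 'picture') continue; // already wrapped

        $picture = $dom->createElement('picture');
        if ($img->hasAttribute('class')) {
            // copy the classes from the original <img> onto the <picture>
            $picture->setAttribute('class', $img->getAttribute('class'));
        }
        $source = $dom->createElement('source');
        $source->setAttribute('srcset', $img->getAttribute('src') . '.webp');
        $source->setAttribute('type', 'image/webp');
        $picture->appendChild($source);

        $img->parentNode->replaceChild($picture, $img); // swap <img> for <picture>
        $picture->appendChild($img);                    // move <img> inside it
    }
    return $dom->saveHTML();
}
```

Hooking this into Page::render is then just a matter of passing the rendered markup through the function before it is returned.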
  2. Thanks for your thoughts - I must admit I am a little behind in adopting webp, but it's time to get up to speed for sure. I would love to say that I could get clients to optimize before uploading, but it's just not going to happen for all of them and unfortunately they often upload images in the MBs in size, not KBs, so I definitely need to do compression of some sort. I see some sites where they are displaying thumbnails of images which are well over 1MB and even on my 150 Mbps connection, they still visibly load quite slowly so I think it's still important to consider this. Please also consider that in some countries (especially here in Canada), mobile data is stupidly expensive, so even if it's fast enough, it's eating into valuable monthly bandwidth if you're not on WiFi. Thanks for the suggestion of going with Strategy 3 for WebP - I am curious about images inserted into an RTE - do you use DOMDocument (or preg_replace) to scan for <img> tags and automatically wrap them in <picture> tags and add the <source srcset> tag or do you have another strategy for this? Thanks again!
  3. Yes, I am using the "Optimize on Pageimage->url() [EXPERIMENTAL]" option - it's the only way that makes sense to me to actually use this module because I don't want to resize the original and I don't always do a size() or crop() call on images. I've commented that log line for now, but I am curious about your thoughts with webp - are you suggesting that if webp is implemented, then there is really no point using this module at all, or just don't use the experimental feature?
  4. @matjazp - I think this is a relatively new thing - I have noticed that the autosmush.txt file is getting a LOT of "Unsupported extension svg" entries. I have an SVG logo in the header of the site, so every page load is triggering this. Just wondering if there could be an easy way to prevent these from being logged? Thanks.
  5. @Typografics - would you mind doing a little debugging to see whether the problem occurs with all the custom panels disabled? Also, does uninstalling and reinstalling Tracy help?
  6. @Typografics - I'd be curious to know if enabling Tracy again brings the problem back because I haven't heard of anything like this before, but would obviously like to fix it if there is a problem.
  7. FYI - https://www.smh.com.au/business/companies/google-says-goodbye-to-the-cookie-monster-increasing-user-privacy-20200115-p53rj5.html
  8. @tires - sorry, this module really isn't seeing much love these days. Pete is long gone and I have never needed to use it, so it's hard for me to continue supporting it, especially given that it really needs a new mail library among other things. To get your site working again, I think you probably just need to delete the email with the Arabic characters in the subject that is causing the problem. If that's not available, I think you should be ok. It probably wouldn't be too hard to fix the bug either, but we'd need to know what $message['headers']['subject'] is returning - actually, dumping $arr just after line 536 would be best.
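For reference, the usual fix for non-ASCII subject lines is to MIME-encode them before they go into the raw header. A minimal sketch of that approach, not code from the module itself:

```php
<?php
// Sketch only: encoding a non-ASCII (e.g. Arabic) subject for a mail header.
mb_internal_encoding('UTF-8');

$subject = 'مرحبا بالعالم'; // Arabic for "hello world"

// RFC 2047 "encoded-word" form, safe to place after "Subject: "
$encoded = mb_encode_mimeheader($subject, 'UTF-8', 'B');

// A compliant mail client (or mb_decode_mimeheader) restores the original
$decoded = mb_decode_mimeheader($encoded);
```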
  9. Are you talking about PHP's "safe_mode"? I'm thinking not, because it hasn't been available since PHP 5.4. Sorry, is there another safe_mode setting I'm not thinking about?
  10. Thanks @flydev for your hard work on this. I installed on a Debian server and had a few issues which I have managed to fix. I submitted a bit of a messed up PR - I think we were both working at the same time. All my tests were in native mode.
      1) The server must have the "zip" package installed - mine didn't, which caused me some grief for a while. I suggest looking for this and warning if it's not installed.
      2) I had to change the mysqldump command quite a bit - you can see the details in the PR, but the key things were adding --single-transaction, specifying the DB explicitly, and removing --skip-lock-tables.
      3) Had to chmod the .sh script to 744 so it was executable, but obviously this may not work on all servers depending on the owner etc.
      4) Had to change the directory and the exec call to actually execute the .sh file - it was failing to run otherwise.
      With those changes everything seems to work great, although I did notice that after running a duplication via the Process module, it no longer reloads the page when done, so you don't see the new package unless you manually reload when it's finished. Thanks again!
  11. Hi @Knubbi - can you please test the attached version of that action? If it works as expected for you, I'll commit it to the module repo. FieldSetOrSearchAndReplace.action.php
  12. Glad that worked but I expect it will likely come back for you so it would be good to figure out what the fix will be. Please keep an eye out for it again.
  13. Thanks @rick - don't worry about those VSCode reported errors - it just doesn't know about those classes, but they work as expected. I think it's likely that you have some non-escaped character or a character encoding issue in your log files that is causing the problem. You could try mb_convert_encoding() on $logLinesData before it is sent to WireCache on line 44. If that doesn't work, maybe instead try utf8_encode() or htmlentities() or something else along those lines. There is this old topic: but I don't think it will help us much here.
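To illustrate the failure mode I suspect (an assumption, not confirmed from your logs): json_encode() returns false as soon as any string in the data contains invalid UTF-8, and converting the string first makes it encodable:

```php
<?php
// Sketch of the suspected failure: a log line containing a byte sequence
// that is not valid UTF-8 makes json_encode() give up and return false.
$badLine = "error at offset \xC3"; // lone lead byte => invalid UTF-8

$failed = json_encode(['line' => $badLine]); // false (JSON_ERROR_UTF8)

// Converting UTF-8 to UTF-8 replaces the invalid bytes with a substitute
// character, after which encoding succeeds.
$cleanLine = mb_convert_encoding($badLine, 'UTF-8', 'UTF-8');
$ok = json_encode(['line' => $cleanLine]);
```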
  14. Hi @rick - I'm not sure why that would happen but it would be helpful if you could dump the content of $data before line 449 in WireCache.php so we can see what is being passed to json_encode so we can see why it might be failing. You could/should probably also check the value of $logLinesData before line 44 in ProcesswireLogsPanel.php Please let me know what you find out.
  15. Thank you! FYI - after setting innodb_buffer_pool_size based on that calculation query, my duplicator cron failed, so I almost feel like that made things worse. I am beginning to wonder if InnoDB just needs more resources than MyISAM and my VPS server just doesn't have enough oomph.
  16. Thanks for the detailed info. I must admit to being pretty new to InnoDB and actually I am glad you mentioned it, because the other site with the bigger DB which is having no problems at all is on MyISAM, so I guess that is the difference. I have already played with a few of those settings you mentioned, but nothing consistently helped. The only thing that seems to be working is the change to set the PHP time limit and max execution time to both be infinite.

      It's also really confusing that this is only an issue when run via cron - it makes me think there is a significant difference in PHP or MySQL settings. Obviously PHP has a CLI-specific .ini file, but I don't think there is anything like that for my.cnf. Maybe Duplicator should look into using WireDatabaseBackup's exec mode: https://github.com/processwire/processwire/blob/321ea0eed3794c5f2b50c216b603fad3e7347ce6/wire/core/WireDatabaseBackup.php#L131-L133 - any thoughts on trying that?

      Regarding your suggestion of disabling Tracy - I have been thinking for a while about having an option to disable Tracy when php_sapi_name() shows the script is running from the CLI. I haven't added it yet, but certainly could, although I doubt that's the cause of this issue.

      I am going to keep monitoring the daily cron over the next week to see if it works every day with those set_time_limit(0); ini_set('memory_limit', '-1'); settings, and if it does, at least that will tell us something. Thanks again, Adrian
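A minimal sketch of that CLI check (the Tracy option itself doesn't exist yet, so the function name here is just illustrative):

```php
<?php
// Sketch: detect a CLI run (e.g. a cron job) so a tool like Tracy could
// bail out early. The option discussed above is hypothetical at this point.
function runningFromCli(): bool {
    return php_sapi_name() === 'cli';
}
```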
  17. Thanks for the update @flydev, but please don't forget about the notice caused by the undefined $zipFilename on this line: https://github.com/flydev-fr/Duplicator/blob/475318d2893e07ab225c89d738efd59d06277546/Duplicator.module#L44
  18. Ok, I'm getting there! @flydev - there is still one more notice in 1.3.13 - line #44 should be:

      $this->log("Logging {$logName}\n");

      i.e. $logName, rather than $zipFilename which isn't defined. Regarding the "MySQL server has gone away" error, it seems that has been fixed by doing this:

      //DUP_Util::setMemoryLimit(self::DUP_PHP_MAX_MEMORY);
      //DUP_Util::setMaxExecutionTime(self::DUP_PHP_EXECUTION_TIME);
      set_time_limit(0);
      ini_set('memory_limit', '-1');

      I know this is not ideal, but it's time to move on, so this will have to do for now.
  19. Thanks @Autofahrn - that new version does fix those notices. Unfortunately I am getting "MySQL server has gone away" errors again when running via cronjob. I don't expect that has to do with the new version, but rather that it's a bit random and the tests after I changed the max_allowed_packet setting didn't actually fix it. The weird thing is that the error is being triggered ~6 seconds into the duplicator process, so it's not a timeout issue.
  20. Thanks @Autofahrn but that is the version I am using. Sorry, I take that back - I am using 1.3.12. Where do I find 1.3.13?
  21. @flydev - sorry, just noticed something new. Even though the Duplicator Process module page shows a valid package created from the cronjob, I am seeing these logged notices that suggest the package was not built correctly. Any ideas?
  22. @flydev - a couple of bugs and questions for you. Firstly, I am getting these notices with the current dev branch: Would be great if those could be cleaned up please.

      The other question is a weird one. With both the master and dev versions I have been getting "MySQL server has gone away" errors lately on one site/server, but only from the cronjob (run from the system cron). If I do a Backup Now from the Duplicator Process module page it works fine. So it seems like there is some strange difference when run via CLI with the cronjob.

      The database size recently went over 128MB, which is the size of my max_allowed_packet setting. I bumped it up and now it works from the cronjob without the error. The weird thing though is that I have another server which has been working fine with a max_allowed_packet of 16M and a database over 250MB.

      Both servers are Digital Ocean VPSs. The one with the errors is running the latest version of Debian and the one without errors is on Ubuntu. The Debian server shows: Ver 15.1 Distrib 10.3.18-MariaDB, for debian-linux-gnu (x86_64) using readline 5.2, while the Ubuntu one shows: Ver 14.14 Distrib 5.7.28, for Linux (x86_64) using EditLine wrapper. So I am wondering if it's a MariaDB vs MySQL issue, the version difference, or something else.

      Anyway, wondering if you (or anyone else) might have come across this recently. Thanks!
  23. But why - I don't really understand the problem. Arrays can have keys with uppercase characters - why are we ending up with the original and an extra entry with a key that is all lowercase?
  24. @nbcommunication - I finally figured out the extra send / count issue. We had 24 subscribers with uppercase characters entered in their email addresses. If I do a strtolower() on each address before building up the "to" and recipient variable arrays, everything works as expected. I am not sure if this is a bug in Mailgun, PW, or this module. Any thoughts?
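The inflated count can be reproduced with plain PHP arrays, since string keys are case-sensitive. This is just a sketch of the suspected mechanism, not the module's code:

```php
<?php
// PHP string array keys are case-sensitive, so the same address entered with
// different casing creates two recipient entries and inflates the send count.
$emails = ['Bob@Example.com', 'bob@example.com']; // really one subscriber

$recipients = [];
foreach ($emails as $email) {
    $recipients[$email] = ['to' => $email];
}
// $recipients now has two entries for one person

// Normalizing with strtolower() before keying collapses them into one
$normalized = [];
foreach ($emails as $email) {
    $normalized[strtolower($email)] = ['to' => strtolower($email)];
}
```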
  25. Just added a new Viewports panel based on a suggestion by @bernhard in response to this post: https://processwire.com/talk/topic/22617-sizzy-browser-developer-tool/ It's pretty basic, but gives you a quick and easy way to view and interact with your site at multiple screen viewport sizes all on the one page (panel) just by scrolling down. Currently there are 6 fixed sizes, but I can expand on these or make them configurable if you'd like. Take a look and let me know what you think.