
Leaderboard

Popular Content

Showing content with the highest reputation on 03/22/2022 in all areas

  1. @fuzenco You may also use cURL, or WireHttp, which is a wrapper around cURL (though I'm not really sure WireHttp covers this case). Otherwise phpseclib is good to go for SFTP, but it may be too much overhead here compared to cURL, if that's available: https://everything.curl.dev/usingcurl/scpsftp https://unix.stackexchange.com/questions/56261/automated-sftp-upload-with-curl
    1 point
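As a rough illustration of the cURL route suggested above, here is a minimal sketch using PHP's cURL extension to upload a file over SFTP. It assumes libcurl was compiled with SFTP support (check `curl_version()['protocols']`); the host, credentials, and file paths are placeholders, not from the original thread.

```php
<?php
// Upload a local file over SFTP via PHP's cURL extension.
// Requires libcurl built with SFTP support.
// Host, user, password, and paths are placeholders.
$localFile = '/path/to/local/file.csv';
$remoteUrl = 'sftp://example.com/upload/file.csv';

$fp = fopen($localFile, 'rb');
$ch = curl_init($remoteUrl);
curl_setopt($ch, CURLOPT_USERPWD, 'user:password');
curl_setopt($ch, CURLOPT_UPLOAD, true);
curl_setopt($ch, CURLOPT_INFILE, $fp);
curl_setopt($ch, CURLOPT_INFILESIZE, filesize($localFile));

if(curl_exec($ch) === false) {
    echo 'Upload failed: ' . curl_error($ch);
}
curl_close($ch);
fclose($fp);
```

For key-based authentication, `CURLOPT_SSH_PUBLIC_KEYFILE` and `CURLOPT_SSH_PRIVATE_KEYFILE` can replace the password, provided your libcurl build supports them.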
  2. @monollonom Thank you! Well, what can I say. I swear I tried this and it wasn't working. I gave it another try and, after uninstalling and reinstalling the module, it suddenly worked. Sorry! I'll read the docs more carefully before I post next time. At least my code may help someone:

public function ___upgrade($fromVersion, $toVersion) {
    $this->createBlock();
}
    1 point
  3. There is an “upgrade” function defined in the Module class you can use (same as “install” / “uninstall”). You don’t need to use a hook for this. https://processwire.com/api/ref/module/ (here’s an example in a module I made)
    1 point
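To show where that method lives, here is a hypothetical module sketching the built-in ___upgrade() hook method described above; the class name, version number, and migration logic are made up for illustration.

```php
<?php namespace ProcessWire;

// Hypothetical module: ProcessWire calls ___upgrade() automatically
// when the installed version number changes. No manual hook needed.
class MyModule extends WireData implements Module {

    public static function getModuleInfo() {
        return [
            'title'   => 'MyModule',
            'version' => 2,
        ];
    }

    public function ___install() {
        // runs once, on first install
    }

    public function ___upgrade($fromVersion, $toVersion) {
        // run migrations only when crossing a version boundary
        if(version_compare((string) $fromVersion, '2', '<')) {
            // e.g. create structures introduced in version 2
        }
    }
}
```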
  4. Same place where we've always been, and where we're ultimately going to be with any dependency management solution, automated or manual: risk assessment. Any time we depend on third-party dependencies, we're choosing to trust the vendors of those dependencies (and by extension anyone they trust, and also anyone who might have access to the "pipeline" between us and those vendors). Whether the benefit outweighs the risk is what matters. (Sorry for going a bit philosophical with this.)

As for Composer vs. npm, my opinion is that the situation with Composer is slightly less problematic, simply due to the ecosystem. I don't have numbers to back this, so please take it with a grain of salt, but in my experience relying on numerous interconnected dependencies is more common for npm projects. ProcessWire modules, for example, tend to have relatively few dependencies, which can be a good thing (in this particular context).

One solution, mentioned in that Reddit discussion as well, is roave/security-advisories. I would recommend adding it to your setups, just in case. It's a super simple package that defines a list of "conflicts", each of those being a package + version with known vulnerabilities. When running composer install or composer update, these conflicts abort the process. This may well be the only "positive use case" for conflicts I've come across. It's not a foolproof solution, but a good start, and pretty much a no-brainer, in my opinion.

Another thing that can help is Private Packagist, which provides automated security audits, alerts, and more. Private Packagist is not free, but for commercial users it's definitely something to consider (and not just for the security features).
    1 point
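For reference, adding the package mentioned above is a one-line change to composer.json; this is a sketch of what that typically looks like (the surrounding project structure is assumed):

```json
{
    "require-dev": {
        "roave/security-advisories": "dev-latest"
    }
}
```

The package ships no code, only "conflict" constraints, so if any required dependency matches a version with a known vulnerability, composer install or composer update fails instead of pulling it in.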
  5. Awesome, thank you! I am getting ‘Error reported by web service: That module is not currently tracked by the modules directory’ when I try to update the status in ‘processwire/module/edit?name=ProcessDuplicator’ I was not in the correct module… sorry.
    1 point
  6. Hi guys, I’ve recently set up 2 PW installations using WebP images (following these explanations, strategy 3). Both servers run on identical system configurations and PW versions (3.0.184). While integrating the WebP functionality was no problem at all, I’m massively confused by the results: one server works as expected, the other does the sheer opposite.

Server 1 (the good one):
Images total: 326
WebP bigger than JPG: 41 (on average more than 40 %)
WebP smaller than JPG: 285 (on average 30–40 %)

Server 2 (the bad one):
Images total: 862
WebP bigger than JPG: 773 (on average 30–40 %)
WebP smaller than JPG: 89 (on average less than 10 %)

As far as I know, the quality of the source JPG has an impact on the WebP: highly compressed JPGs may lead to barely smaller or even bigger WebPs, while the savings with high-quality JPGs tend to be more spectacular. Server 1 seems to confirm this assumption (the JPGs with bigger WebPs here are highly compressed third-party images), while server 2 is acting completely strange. Its source JPGs are around 1200 × 800 pixels with a moderate compression rate and file sizes ranging between 100 and 500 kB, averaging 250 kB. The JPG quality on server 1 is about the same (regardless of the 41 lousy ones); the only difference is their smaller size of 900 × 600 px with an average file size of 150 kB. So I’d consider the WebP use on server 1 clearly progressive, while server 2 essentially limits itself to filling up the webspace with bigger images that will never appear on a display. Is there any influence on the WebP conversion I might have missed?
    1 point
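To gather statistics like the ones above, a quick audit can be run in the Tracy console. This is a sketch, assuming the images live in a field named `images` on pages using a template named `gallery` (both names are placeholders), and that `$image->webp` returns the WebP variant as a PagefileExtra:

```php
<?php
// Rough audit: compare JPG vs. WebP file sizes across a set of pages.
// 'gallery' (template) and 'images' (field) are assumed names.
$bigger = 0;
$smaller = 0;
foreach($pages->find("template=gallery") as $p) {
    foreach($p->images as $image) {
        $jpgSize  = $image->filesize();
        $webpSize = $image->webp->filesize(); // WebP copy as PagefileExtra
        if($webpSize > $jpgSize) $bigger++; else $smaller++;
    }
}
echo "WebP bigger: $bigger, WebP smaller or equal: $smaller";
```

Note that accessing the WebP variant may trigger its generation on first run, so the initial pass can be slow.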
  7. Hi @prestoav Just find your selected page in the page tree and there will be an 'Unselect' option. See the screenshot in the thread.
    1 point
  8. Hello guys, is there a most recent version of this module?
    1 point
  9. Thanks to bernhard and Autofahrn I have come up with this example code and run it in TracyDebugger on a test page with an image field, and it works beautifully.

<?php
/* get and save a new image to the image field's Pageimages array */
$page->of(false);
$pageImages = $page->images->add('https://www.somesite.com/image_of_tree.jpg');

/* save the page (perhaps not needed, but there for comfort) */
$page->save();

/* get the last added image */
$lastImage = $page->images->last();

/* debug before changes */
d($lastImage, '$lastImage before changes');

/* add tags and a description to the image */
$lastImage->addTag('test');
$lastImage->addTag('Tree');
$lastImage->addTag('Syren');
$lastImage->addTag('Sun');
$lastImage->addTag('Sunny');
$lastImage->description = 'This is a beautiful tree.';

/* debug info */
d($page->images, '$page->images');
d($lastImage, '$lastImage');

/* save the page */
$page->save();
?>

I used the following API docs and the forum users' help mentioned above to accomplish this:
https://processwire.com/api/ref/pageimage/
https://processwire.com/api/ref/pageimages/
https://processwire.com/api/ref/pagefile/
https://processwire.com/api/ref/pagefiles/
Just wanted to post this at the end so others who wonder about this have a starting point.
    1 point
  10. Hello, in one of my templates I use URL segments. As allowed URL segments I have:

regex:[0-9]+
regex:[0-9]+/.*$
regex:[0-9]+/[^"]+

When I use a URL matching the second/third pattern, it works fine as long as it contains only letters and digits. But when the URL contains a '+' or '%' (urlencode!) I get a blank page.

.../1234/abcd > works fine
.../1234/abcd+5 > outputs a blank page
.../1234/abcd%5 > outputs the message: Bad Request: Your browser sent a request that this server could not understand.

Any suggestions?

EDIT: I just found this from Ryan: https://processwire.com/talk/topic/580-encrypted-urlsegment/ In addition to the expected forward slash to separate directories, ProcessWire only accepts these [ASCII] characters in its URLs: abcdefghijklmnopqrstuvwxyz 0123456789 . - _
    1 point
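Given the whitelist Ryan describes, the behavior above makes sense: segments containing '+' or '%' never match, so the page never renders. A small sketch of how to check and work around it (the helper function and the base64 workaround are illustrative, not from the thread):

```php
<?php
// Check a segment against the character whitelist ProcessWire accepts
// in URLs (a-z, 0-9, '.', '-', '_', plus '/' as separator).
function segmentIsAllowed(string $segment): bool {
    return (bool) preg_match('#^[a-z0-9._-]+$#', $segment);
}

var_dump(segmentIsAllowed('abcd'));   // true
var_dump(segmentIsAllowed('abcd+5')); // false, '+' is not whitelisted

// One workaround: map arbitrary data into the allowed set before
// building the URL, e.g. base64 with unsafe characters swapped out.
$encoded = strtr(base64_encode('abcd+5'), ['+' => '-', '/' => '_', '=' => '.']);
```

Decoding then reverses the `strtr` mapping before calling `base64_decode()`.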