
Leaderboard

Popular Content

Showing content with the highest reputation on 05/23/2023 in all areas

  1. Just stumbled over this... https://www.slant.co/topics/5409/~php-cms
    4 points
  2. @bernhard Usually ProfilerPro would be great for this, but in some cases I can't use it because ProfilerPro is a module and I'm timing something that happens prior to modules loading. When PW is in debug mode, it times all of the boot processes, so I can just look at those. I also add other Debug::timer() calls as needed to time specific things. When testing in the admin, you can finish a timer with Debug::saveTimer($timer); and it'll show the result in the Debug > Timers section of the admin. But you can't rely on any one instance of a timer; you've got to take several samples before you really know if there's a trend in the numbers. I'm usually looking for a consistent 10 ms or more when it comes to improvements.
     @Ivan Gretsky The cache supports an expiration flag represented by the WireCache::expireReserved constant. It means that rows with that flag are reserved and should never be deleted. If you use the $cache API to clear the cache, it'll do it in a safe way. But if you just omit the caches table from a DB dump or manually delete all the rows in the caches table, then it would be problematic. I agree it would be better not to have to consider this. I'm not sure we need to keep that flag for anything other than modules, so this is one reason why I'm looking to have the modules use some other storage method for their cache. Though if new caching options came along (Redis, etc.) it would sure be nice for modules to be able to utilize that caching method too, so there are tradeoffs. Ideally, the modules would still use WireCache but be able to recover easily if their caches get deleted, so that's probably what I'm going to work towards, but it's not as easy as it sounds.
     @teppo Sounds great! I have no experience with Redis, but this server seems to have it and I've also been curious about it. I really like the idea of dedicated cache systems independent of the DB or file system. I'd definitely like for PW to be able to support them, and I'm glad to hear you are interested in it too.
    3 points
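     A minimal sketch of the Debug::timer() / Debug::saveTimer() pattern described above (assuming ProcessWire's core Debug class and debug mode enabled; the selector and note text are made up):

     <?php namespace ProcessWire;

     // Start a timer; with no arguments Debug::timer() returns a new timer key
     $timer = Debug::timer();

     // ...the code being profiled, e.g. an expensive find() call...
     $items = $pages->find("template=blog-item, limit=50");

     // Save the result so it appears under Debug > Timers in the admin;
     // take several samples before reading anything into the numbers
     Debug::saveTimer($timer, 'blog-item find');

     // Or read the elapsed time (in seconds) directly
     $elapsed = Debug::timer($timer);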
  3. Hi, I upgraded 100+ sites to the latest master on Friday, and we've gotten the following error email notification from a few of them over the weekend: Umm… Error: Exception: Unable to obtain lock for session (retry in 30s) (in processwire/wire3/modules/Session/SessionHandlerDB/SessionHandlerDB.module line 96) User: ?, Version: 3.0.184 This is a new error, added in the latest master release: https://github.com/processwire/processwire/commit/7a2ff6c15dde8f6768676181bfbeeaefe4761b0b#diff-4c158c18e5f331d4e9ff8b27eff8ae2c4cd34d16f6d0f2b427aa36791637c64f The session lock time is set to 50 seconds, which I think is the default, so I suspect the issue is on our end with our databases/server(s). I'm going to investigate further and try to figure out the issue, and I'll update here if/when I do. If anyone else has come across this or has any more info, please let me know! Cheers, Chris
    1 point
  4. @Roych - are you using the original (https://github.com/marcostoll/processwire-fieldtype-assisted-url), or my fork: https://github.com/adrianbj/processwire-fieldtype-assisted-url ?
    1 point
  5. @netcarver Yes, I've created a gist that should do the trick.
    1 point
  6. Hi, and welcome to the forums! Note that questions about ProFields are best posted in the ProFields Support forum (https://processwire.com/talk/forum/28-profields-support/, in the VIP Support section). Meanwhile, perhaps this blog post will help with your question: https://processwire.com/blog/posts/new-repeater-and-repeater-matrix-features/#matrix-type-groups Or this forum post:
    1 point
  7. Hi @Boost Take a look at the Family tab on the template's settings page, find "Can this template be used for new pages?", select "No", and save. Voila. Gideon
    1 point
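     A quick API-side sketch of the same Family setting (assuming the "Can this template be used for new pages?" option maps to the template's noParents property, with 1 meaning "No"; the template name is hypothetical):

     <?php namespace ProcessWire;

     $t = $templates->get('event'); // hypothetical template name
     $t->noParents = 1;             // "Can this template be used for new pages?" => No
     $t->save();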
  8. Ok, so my hunch about what was going on in the session DB handler was wrong - thank you for trying it anyway.
    1 point
  9. Hi @franciccio-ITALIANO I suppose url_bg and box_url_img_SX are both image fields. If that is the case, your code should be something like:
     <script src="<?= $page->url_bg->url ?>"></script>
     or:
     <img style="padding-left: 15px;" src="<?= $page->box_url_img_SX->url ?>" alt="<?= $page->box_img_alt1 ?>">
     Gideon
    1 point
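     One caveat worth keeping in mind (an assumption, since the field setup isn't shown above): the value an image field returns depends on its "maximum files" setting, so a sketch like this covers both cases:

     <?php namespace ProcessWire;

     $value = $page->box_url_img_SX;

     if($value instanceof Pageimages) {
         // field allows multiple images: take the first one (may be empty)
         $img = $value->first();
         $src = $img ? $img->url : '';
     } else {
         // field is limited to a single image (max files = 1)
         $src = $value ? $value->url : '';
     }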
  10. Great - thank you for trying it! I'll see if I can get some changes into a PR for Ryan - hopefully this weekend.
    1 point
  11. I can confirm that with your module I do not get the error.
    1 point
  12. @gebeer Could you try my spin on the SessionHandlerDB module that I posted here, and let me know if that makes any difference?
    1 point
  13. I solved it, the problem was what you said, thank you!
    1 point
  14. @bernhard you might want to check this one out as it works perfectly well with the free ChatGPT 3.5 (GPT 4 is acting weird with this prompt). https://flowgpt.com/prompt/qGQmSnF-MDsfhDhpfzZEM
     - Copy the whole prompt
     - Send it and wait a moment
     - Explain what you want to accomplish
     - Answer questions or add details if needed
     - Ask GPT to write the code
     You have to get used to it as it's quite verbose sometimes.
    1 point
  15. @netcarver I've been using Latte in Laravel almost exclusively. Blade is great, but Latte is even better. There's a composer package for integrating Latte, but it was out of date when I last checked, so I wrote my own, which was quite straightforward. Interestingly, you can mix and match, calling Blade views from Latte and vice versa, which is a lifesaver when dealing with vendor files and third-party packages.
    1 point
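     For anyone curious what a bare-bones Latte setup looks like outside any framework integration, a minimal sketch (assuming Latte 3 installed via Composer; the file paths and parameters are made up):

     <?php

     require __DIR__ . '/vendor/autoload.php';

     $latte = new Latte\Engine();

     // compiled templates are cached here; the directory must be writable
     $latte->setTempDirectory(__DIR__ . '/temp/latte');

     // render a template file with parameters (output is auto-escaped by default)
     $latte->render(__DIR__ . '/views/home.latte', [
         'title' => 'Hello from Latte',
     ]);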
  16. Thanks for the link, Bernhard. I've been trying Blade in Laravel recently, but I think I need to try out Latte as well.
    1 point
  17. Very interesting blog article that compares Twig, Blade, and Latte: https://blog.nette.org/en/quiz-can-you-defend-against-xss-vulnerability
    1 point
  18. I am fine-tuning existing pre-trained models and using the LangChain toolchain. (About the latter, I suggest you try, and keep an eye on, a fork of PrivateGPT: @su77ungr/CASALIOY.) Being able to run it without an Internet connection is the most important requirement for me, and I also got really good results with CASALIOY after ingesting a small part of the company's knowledge base. Some years ago I built a license plate recognition system deployed in our car parks, and I am now able to ask a basic question in the context of our proprietary software and get a response like "Blabla you need to send this MSG_LPR_.. Windows Message with this LParam and WParam, the block will answer you the current amount due by the client in a JSON string stored in WParam...". It can also explain what a setting does, along with a real example in context, e.g., "If the setting `blabla` is set to true, when a car approaches, if it's a VIP or its annual fixed bill is paid, the barrier is opened...". It's really astonishing. I am using Vicuna-13b, not GPT; you can find more info at lmsys.org.
     Last week I experimented with a model called YOLOv7, which is an algorithm for detecting objects in an image. The challenge for us (I got it almost working) is to detect the vehicle type in real time to apply our logic, for example detecting taxis, ambulances and trucks and opening the barrier. Look at that: the AI is triggered in real time on the picture sent by an IP camera already used in our LPR system. You might want to read this paper: https://arxiv.org/pdf/2212.10560.pdf You will find examples of how to implement it on GitHub.
     After the release of my ongoing project, I have in mind to launch a project, if Ryan and the mods team consent to it, which consists of scraping the entire forum and building a chatbot. I'd find it fun to build, and I would like to see how good its answers would be to issues like the ones that recently popped up again, you know, the "forged session" thing, the "backend login expiration crash", etc., rather than focusing on the core code base of the tool. We might collaborate to learn together 🙂 Just adding that the main issue is the hardware power...
    1 point
  19. I've just released a new version with a ton of updates 😎 https://github.com/baumrock/RockMigrations/releases/tag/v3.25.0 I've started to adopt a new workflow, because I often heard that "it's hard to keep track". That's why from now on I'll push updates to the dev branch and then merge them to main from time to time. Highlights: I improved the deployment log a lot, made everything a lot cleaner, and tried to catch all the warnings that were previously just written to the log. We also have a new tweak to show page IDs in the page tree, plus a helper method to add labels (and the template name) in the page list. Check out https://github.com/baumrock/RockMigrations/releases/tag/v3.25.0 for all updates!
    1 point
  20. Hi @franciccio-ITALIANO, it seems that you always use the same ID "block01" for all your modals, so only one modal will ever be opened. You should make the ID dynamic:
     <?php foreach($page->box_ripetitore as $b => $boxrip): ?>
     <!-- START OF REPEATED BLOCK -->
     <div class="<?php echo $boxrip->box_ripetitore_colore ?>">
         <div class="uk-tile" id="op31h2">
             <a href="#block<?= $b ?>" uk-toggle>
                 <h2 style="font-family: Anton; font-size: 3em; color: white;"><?php echo $boxrip->box_ripetitore_tit ?></h2>
             </a>
             <div style="font-family: Oswald; color: gold; font-size: 1.5rem;"><?php echo $boxrip->box_ripetitore_sottotitolo ?></div>
             <!-- LEAF SECTIONS -->
             <div id="block<?= $b ?>" class="uk-modal-container" uk-modal>
     Regards, Andreas
    1 point
  21. I couldn’t resist the hype and created a simple module using the ChatGPT API to process field values. You can find it on GitHub here: https://github.com/robertweiss/ProcessChatGPT ProcessChatGPT is triggered upon saving, if you select it in the save dropdown. It processes the value of a page field, which can be set in the module config, using ChatGPT. The processed value can be saved back to the same field or to another field on the same page, which can also be set in the module configuration. You can add commands to the value that will be prefixed to the source field content. This way, you can give ChatGPT hints on what to do with the text. For example, you could add ›Write a summary with a maximum of 400 characters of the following text:‹. One of my clients is already using the module to summarise announcement texts for upcoming music events on their website (Let’s face it, nobody reads them anyway 😄). If anyone finds it useful, I would be happy to submit it to the official module list.
    1 point
  22. As a follow-up, I dropped that plan after I read about ChatGPT plugins, specifically the retrieval plugin, and then watched a video about it. That seems a better way to go. But then again, after watching it I realised what a big effort it is to initially collect the data before we can provide it to ChatGPT via the retrieval plugin. Definitely not something that I would be up for on my own. But at least I learned something 🙂 Apart from that, the more I read about the company OpenAI, the more I realise that I don't want to support them. Being a proponent of the open source spirit, I'd better look deeper into really open projects like https://github.com/nomic-ai/gpt4all and see how I can get a fine-tuned model for programming with PW. Like @gornycreative said earlier, I also think the future lies in more specialized models. Anyway, still a long way to go until we have chatPW 🙂
    1 point
  23. ChatGPT is soon to replace Google, Stack Overflow and Quora for answering questions like these. I have used it to either confirm or throw speculation on topics and directions of research. I have asked it very technical academic questions in certain fields and it has given balanced views for just about everything, including outsider views and newer untested views on certain problems. Very fun. It has even been able to tell me when a problem that I thought still existed had already been resolutely solved. Such a time saver!
     My main comment, though, is that the marriage of page classes and migrations is truly groundbreaking in terms of modular design, and in particular the ability to bring over single files that can instantly put in (or remove) scaffolding for demos. This has been AWESOME for live brainstorming and troubleshooting, or for providing clients with examples of how extensible the system is and how quickly modifications and improvements to functionality can be built out. This has made me more excited about using RockMigrations - I was a bit on the fence when the automation was centralized in migrate.php because of how unwieldy automation scripts can get in other applications I use - but bringing the things that involve scaffolding into their respective page classes allows me to use migrate.php just for module installations, configuration, general hooks and other core modification processes. This is way easier to navigate, share and especially train on. It's also a method that's very easy to coach new devs on.
    1 point
  24. v2.15.1 ignores module migrate files if the corresponding module is not installed 😎 And it will trigger the migrate file right after the module has been installed! Thx for your input and feedback, guys! Keep the good suggestions coming 🙂
    1 point
  25. Hi @Kiwi Chris I've added a config setting to prevent migrations if you don't want them to be triggered automatically: $config->noMigrate = true; https://github.com/baumrock/RockMigrations/#running-migrations For adding this setting only to your local dev setup see https://processwire.com/talk/topic/18719-maintain-separate-configs-for-livedev-like-a-boss/
    1 point
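     A possible way to apply that setting only on a local machine, assuming the config-dev.php approach from the linked "separate configs" topic (ProcessWire loads /site/config-dev.php instead of /site/config.php when it exists, so the dev file is typically kept out of deployments):

     <?php namespace ProcessWire;

     // /site/config-dev.php - local development only, excluded from the live server
     require __DIR__ . '/config.php'; // reuse the shared settings

     $config->debug = true;
     $config->noMigrate = true; // don't run RockMigrations automatically on this machine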
  26. You can do that. RockMigrations will only do what you tell it to do. You can also do that, and sometimes I do it myself. It's just a reminder that if you apply changes to something that is defined somewhere in the code of a migration, your changes will get overwritten. If you apply changes via the GUI that are not set in any migration, then you'll be fine and your manual changes will be kept. That indicator could be improved, I guess. For example, it could only appear on fields or templates that received changes from a migration, and changes done via RockMigrations could be listed instead of the generic warning. If that is an important feature for you (or anybody else), I'd be happy to merge any PR in that direction and provide all the help needed.
    1 point
  27. Another thing is, you can re-use custom page classes in new projects and have their fields/templates created automatically. Develop them only once, copy them over to a fresh PW install, and there you go.
    1 point
  28. What do you struggle with @Ivan Gretsky? I use RockMigrations for every project right now. You get version control for your fields and templates, and even more. So if you are using git and developing new features, it is of great use because it adds the needed fields/templates when switching branches. Also, in conjunction with RockFrontend's livereload feature, this is a lot of fun: you add a field in migrate.php (or somewhere else where you need it, like a custom page class) and the page in the admin automatically shows the newly added field. This has made my workflow so much quicker. I don't use the deploy feature (GitHub Actions) because I have my own GitHub Actions, so I can't say anything about that.
    1 point
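     For readers who haven't seen the module yet, a rough sketch of what that migrate.php workflow can look like (the field and template names are made up; the declarative array syntax matches the pageclass example in the next item):

     <?php namespace ProcessWire;

     /** @var RockMigrations $rm */
     $rm = $modules->get('RockMigrations');

     $rm->migrate([
         'fields' => [
             // hypothetical field: shows up in the admin as soon as this file is saved
             'subtitle' => [
                 'type' => 'text',
                 'label' => 'Subtitle',
             ],
         ],
         'templates' => [
             'basic-page' => [
                 'fields' => ['title', 'subtitle', 'body'],
             ],
         ],
     ]);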
  29. Thx 🙂 Yeah, that's the usual way of doing migrations. I started like this at the very beginning, always creating a file with an upgrade() and a downgrade() method. But the declarative syntax is just so much better! It's easier and a lot faster to write (develop), it's easier to read (understand), and it's also a lot easier when looking at git diffs: you instantly see what I did here: renamed the field constant from field_persons to field_personimages and added the field_personmatrix below the field_footermenu. And you can simply go back and forth in time by using your IDE.
     You could do that. The easiest way is to put everything into migrate.php. It's the same as with hooks: where do you place your hooks? All in ready.php? Maybe you started like this, but likely you don't do so any more and have most of them in dedicated modules where you bundle things that belong together. As you said, that has the huge benefit of making things reusable. Since I've adopted that approach I'm developing faster than ever before. But to get started, just go ahead with migrate.php 😉
     Regarding your question about one file for fields and one for templates: you can do so, it's maybe a matter of preference, but I prefer to put everything in modules or pageclasses. Let's say we create a blog. We need a "blogitem" pageclass, and that pageclass uses the fields "title", "date" and "content":
     <?php namespace ProcessWire;

     use RockMigrations\MagicPage;

     class BlogItemPage extends Page {
         use MagicPage;

         public function migrate() {
             $rm = $this->rockmigrations();
             $rm->migrate([
                 // field migrations
                 'fields' => [
                     'date' => [...],
                     'content' => [...],
                 ],
                 // template migrations
                 'templates' => [
                     'blog-item' => [
                         'icon' => '...',
                         'fields' => [
                             'title' => ['columnWidth' => 70],
                             'date' => ['columnWidth' => 30],
                             'content',
                         ],
                     ],
                 ],
             ]);
         }
     }
     Do you understand what it is doing? At first sight? Or would it have been easier if the migrations lived in fields.php and templates.php?
     Also a thing to keep in mind is that sometimes the order of execution matters, and that's easier to handle if you place your migrations in places that belong logically together. That situation often arises when I'm writing migrations for parent/child template relations. The concept is as follows:
     - Have a master module (I recommend having a Site.module.php for every project, and the default migrations profile does that for you 🙂)
     - Load pageclasses from the master module (eg BlogItems and BlogItem)
     - Trigger migrations in every pageclass (RM has helpers for that, see https://github.com/baumrock/RockMigrations/wiki/Ship-your-Module-with-Custom-Page-Classes). That takes care of creating all fields for the template and adding fields to the template (like the example above).
     - THEN set the parent/child relationship (both templates need to exist before we can set this relation!). This sets up the backend so that if you create a new page under "BlogItems" you are only allowed to add a single "BlogItem".
     Does that make sense so far?
     RockMigrations runs on every single request. It checks all the files that have been added to the watchlist (migrate.php is added by default, others can be added via $rm->watch(...)) and it will run the migrations for every file in the watchlist that has been changed since the last run of migrations. So if you had 30 files watched and you change one (let's say "BlogItem.php"), then only the migrations of this file will be run; all others will be skipped.
     You can inspect that easily via the Tracy logs panel, where you see what is going on and in which order. Now if you combine that with RockFrontend, then you get instant reloads of the backend and therefore instant migrations while you code. So great, you have to try it! If you run migrations from the CLI or you do a modules refresh, then all migrations are run (just in case the watcher does not catch a file or something goes wrong). If you still have any questions let me know 🙂 Also I'm happy to do consulting via online meeting if you have questions specific to a project (or want to support me and the development of my modules).
    1 point
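     To illustrate the parent/child step at the end of that list, a sketch of what the parent pageclass might do once both templates exist (treat the setParentChild() helper name as an assumption and check the RockMigrations docs for the exact API; the template names follow the blog example above):

     <?php namespace ProcessWire;

     use RockMigrations\MagicPage;

     class BlogItemsPage extends Page {
         use MagicPage;

         public function migrate() {
             $rm = $this->rockmigrations();

             // create/update the parent template first
             $rm->migrate([
                 'templates' => [
                     'blogitems' => ['fields' => ['title']],
                 ],
             ]);

             // then restrict the family: "blogitems" pages may only get "blog-item" children
             $rm->setParentChild('blogitems', 'blog-item');
         }
     }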
  30. In the past I had used the old version of RockMigrations for some simple tasks like adding fields and templates, which turned out to be a huge time saver. But only just recently did I start to use the new version and discover the possibilities of this tool in more depth. And what should I say, I am really amazed. Once I had grasped the core concepts and, with @bernhard's help, had learned how to use it in a more structured way and in different contexts, it turned out to be an even bigger time saver. I hate repetitive tasks. Adding fields and templates, adding fields to templates, and configuring labels / access rights etc. for fields in the template context is a pleasure when using the PW GUI. But it is still a repetitive task. With RockMigrations I can have the definitions of these fields / templates and fields in template context in a single file and reuse that. In a matter of seconds I have my reusable base structure available to start off a new project. Adding new structure through the GUI in a staging environment and then having to replicate it on the live system through the GUI: a repetitive task again. Pushing a migration file to the live system and running the migration: again a matter of seconds. Writing migrations wherever I want is also a great feature. Be it inside site/migrate.php or inside a newly developed module, it doesn't matter, RockMigrations will find it and do its job. At the beginning I wasn't sure how to define different field types with all their different properties. RockMigrations comes with VSCode snippets that make things easy. Or you can create a field in the GUI and then just copy/paste the code for the migration from the GUI into your migration logic. So however you prefer to create your fields, RockMigrations has you covered. This post may sound like an advertisement, but I just wanted to express how happy I am after having made the decision to spend some time learning how to work with this module. That time was definitely well spent. Big thanks to Bernhard for creating this and releasing it as a free module.
    1 point
  31. Or just use {dump $page->title}. I think you can drop the parentheses after items or content. Also, you might want to use an n:if clause on the ul, so the markup is discarded if there are no items:
     <ul n:if="$block->items">
         <li n:foreach="$block->items as $item">
             <a href="#">{$item->title}</a>
             <div>{$item->content|noescape}</div>
         </li>
     </ul>
     I am also using Latte in an actual project and love it so far. For the integration into ProcessWire I am using TemplateEngineFactory. It also has a Latte renderer (but with Latte 2 at the moment; the current version is 3).
    1 point
  32. Sure! Here's a few that might be of use to ProcessWire sites:
     <?php

     use Latte\Engine;

     /**
      * Latte filter provider
      */
     final class Filters {

         /**
          * Install available filters.
          */
         public static function install(Engine $latte) {
             foreach ((new static())->provide() as $name => $callback) {
                 $latte->addFilter($name, $callback);
             }
         }

         /**
          * @return array<string, callable>
          */
         public function provide(): array {
             return [
                 // Sanitize values using ProcessWire's sanitizer API variable
                 'sanitize' => function ($value, $sanitizer, $options = null) {
                     if (!$options) {
                         return sanitizer()->$sanitizer($value);
                     } else {
                         $args = func_get_args();
                         unset($args[1]); // remove $sanitizer arg
                         return call_user_func_array([sanitizer(), $sanitizer], array_values($args));
                     }
                 },
                 // Render FormBuilder form, but allow prepending and appending fields
                 'form' => function ($form, $options = []) {
                     if (!$form) {
                         return '';
                     }
                     $output = $form->render();
                     if ($options['append'] ?? false) {
                         $output = str_ireplace('</form>', $options['append'], $output);
                     }
                     if ($options['prepend'] ?? false) {
                         if (stripos($output, '<form') !== false) {
                             $output = preg_replace('#(<form[^>]*>)#i', '\\1' . $options['prepend'], $output);
                         } else {
                             $output = $output . $options['prepend'];
                         }
                     }
                     return $output;
                 },
                 // URL slug / page name
                 'slug' => function ($str, $length = 128) {
                     $str = sanitizer()->unentities($str);
                     $str = sanitizer()->pageNameTranslate($str, $length);
                     return $str;
                 },
                 // Truncate string
                 'truncate' => function ($str, $length = 200) {
                     return sanitizer()->truncate($str, $length, ['visible' => true]);
                 },
                 // Render Markdown
                 'markdown' => function ($str) {
                     $str = "{$str}";
                     modules()->get("TextformatterMarkdownExtra")->formatValue(new Page(), new Field(), $str);
                     return $str;
                 },
                 // Unwrap ProcessWire value objects
                 'value' => function ($object) {
                     if (is_object($object)) {
                         if ($object instanceof SelectableOptionArray) {
                             return $object->implode(' ', 'value');
                         } else {
                             return $object->value ?? '';
                         }
                     } elseif (is_array($object)) {
                         return implode(' ', $object);
                     } else {
                         return (string) $object;
                     }
                 },
                 // Join a string with a custom separator at the end
                 'join' => function ($list, $separator = ', ', $lastSeparator = ' & ') {
                     if (count($list) > 1) {
                         $last = array_pop($list);
                         return implode($separator, $list) . $lastSeparator . $last;
                     } else {
                         return implode($separator, $list);
                     }
                 },
                 // Prettify URL by removing protocol and www
                 'prettyUrl' => function ($url, $options) {
                     if (null === $url) {
                         return false;
                     }
                     $url = trim($url);
                     if ($options['www'] ?? true) {
                         $url = str_replace('www.', '', $url);
                     }
                     if ($options['http'] ?? true) {
                         $url = str_replace(array('https://', 'http://'), '', $url);
                     }
                     if ($options['slash'] ?? true) {
                         $url = rtrim($url, '/');
                     }
                     return $url;
                 },
             ];
         }
     }
    1 point
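     For context, a short usage sketch for the filter provider above (the engine setup and the template values are assumed; install() and the filter names come from the class itself):

     <?php

     // wherever the Latte engine is created
     $latte = new Latte\Engine();
     Filters::install($latte); // registers every filter returned by provide()

     And then in a Latte template:

     <h1>{$page->title}</h1>
     <p>{$page->summary|truncate:300}</p>
     <a href="/{$page->title|slug}/">Pretty link</a>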
  33. Need a language switcher?
     <div class="...">
         <a n:foreach="$languages as $lang"
            href="{$page->localUrl($lang)}"
            class="..."
         >
             {$lang->title}
         </a>
     </div>
    1 point
  34. Cross-posting in case any Wireframe user wants to give Latte a try: there's a renderer for it now. Somewhat experimental, since I have a rather minuscule understanding of Latte myself. Slightly annoying marketing shenanigans aside (I didn't know how to do a single thing in Latte without reading the docs, "Latte is the only system with an effective defense, thanks to context-sensitive escaping" is a silly thing to claim, etc.), Latte does look quite nice. I dig the n:attributes in particular; they remind me a lot of the AngularJS (v1) syntax. Apart from that, Latte seems largely the same as other engines/languages I've used (e.g. Twig, Blade, and Dust). Admittedly I've just scratched the surface, so there's likely a lot more there. I'm not sure yet whether it's a good thing or not that Latte syntax is so close to plain PHP, just with auto-escaping enabled and <?php ?> replaced with curly brackets. Among other things, a) the PHP syntax isn't necessarily the easiest to grasp or nicest to look at, especially for non-developers (though this is admittedly highly opinionated), and b) there's a slight fear in the back of my head that this actually makes it extra tempting for developers to put unnecessarily complex code within their views. So far, among the templating languages I've used, Blade has been my personal favourite. It's easy to grasp, familiar to PHP users but also to everyone who's ever used another templating language (like Twig), has some very handy shortcuts available (particularly when it comes to loops), the syntax looks nice and clean (in my opinion), and components in particular are just brilliant in terms of both implementation and use. But enough advertising.
    1 point
  35. Another nice one from today: Imagine you are on a blog overview page and you want to provide a link for your clients to directly add a new blog post (if they are allowed to):
     <a n:if="$page->addable()"
        href="{$pages->get(2)->url}page/add/?parent_id={$page}"
     >
         Add new blog-item
     </a>
    1 point
  36. - bd() and d() - I've learned SO much about ProcessWire and PHP and OOP in general, can't thank you enough @adrian
     - console - it's a dream to work with!
     - request info panel - invaluable, especially when working with RockMigrations
     - user switcher - handy
     - refresh feature - modules refresh without moving away from the current page
     That last one is also great for working with RockMigrations: usually I create a migrate() method that is triggered on modules refresh. Then, to add a new field to a page for example, I add the new field in code while editing that page and just hit "modules refresh" via Tracy, and I end up on the same page edit screen with the new field ready to be populated. Oh, and I've been using Adminer for backup/restore quite often lately...
    1 point
  37. This is a great topic! I would love to learn more about the features of Tracy Debugger. I did start to read the docs a few times, but never had time to proceed)) As for me, I actually do not have a lot to share, as I only seem to use: the great bd() and the dumps recorder, and the bluescreen (you can't avoid that, but it is really helpful anyway), and the console once in a while to test selectors or something minor like that, and the donate button every time I install TracyDebugger on a site, ...but nothing more advanced, really. Would love to learn new tricks though)
    1 point