Popular Content

Showing content with the highest reputation on 02/09/2023 in all areas

  1. I wanted to share this tip which I'm going to implement on the Transferware Collectors Club Database. That site has a form which is a set of filters. It takes many queries to build that form and all of its options, and as a result it takes over 5 seconds to load the page. ProCache is out of the question because using a full page cache on a site like that doesn't make sense, but caching various areas with WireCache ($cache) does.

The form is developed in a way where it will automatically populate/select the various form fields based on what it reads from the URL. So going to /?color=123 will automatically check the appropriate radio button for that color. Therefore, when using $cache to cache the form (or parts of the form), we need to cache the form in a clean, unpopulated state. The problem then becomes: how do we use the cached form but also populate the fields accordingly based on the URL variables? Markup regions can do this.

Assuming we had a set of radio options coming from our WireCache'd form like this:

...
<input type="radio" name="color" id="color-123" value="123"> Red
<input type="radio" name="color" id="color-124" value="124"> Blue
...

We could devise some code that also outputs this on our page:

<region pw-replace="color-123" checked></region>

Markup regions will then "check" that radio. Perhaps this isn't mind-blowing, but I've never thought of using Markup Regions in this way.
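Putting the pieces together, the approach above might be sketched like this in a template file (a minimal sketch only; buildColorForm() is a hypothetical function standing in for the expensive form-building code):

```php
<?php namespace ProcessWire;

// Cache the clean, unpopulated form markup. The many queries that
// build the form only run on a cache miss (here: expires after 1 day).
$formMarkup = $cache->get('filters-form', 86400, function() {
    return buildColorForm(); // hypothetical: renders the full filter form
});

echo $formMarkup;

// Populate the cached form from the URL using a Markup Region:
// visiting /?color=123 checks the radio with id="color-123".
$colorId = (int) $input->get('color');
if($colorId) {
    echo "<region pw-replace='color-$colorId' checked></region>";
}
```

Because the cache key never varies by URL parameters, one cached copy of the form serves every filter combination, and the cheap Markup Region output does the per-request population.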
    4 points
  2. You don't need a module; it can be done with a hook in ready.php and a warning page (with its own template). -> https://processwire.com/api/ref/session/login-success/

$this->addHookAfter('Session::loginSuccess', null, function($event) {
    if($event->wire('user')->hasRole("specific_role")) {
        $event->wire('session')->redirect($event->wire('pages')->get("/warningpage")->url);
    }
});

On the "warningpage" template, log the user out with this code:

$session->logout();
    3 points
  3. Check your config file for something like this:

$config->prependTemplateFile = '_init.php';
$config->appendTemplateFile = '_main.php';

I think it will be the line with _main.php 😉 This is a standard configuration in the Regular Site Profile.
    2 points
  4. These last few weeks I've been working on integrating a ProcessWire installation with the Fareharbor API for a client. Other than the authentication part (which is as simple as it gets), I've found this API to be one of the more time consuming ones to work with. It's not so much that the API is difficult to use, as much as it is just a time sink, taking a long time to reorganize the info it provides into something useful for our needs. And likewise taking a long time to prepare information to put back into it in the format it requires.

My best guess is that it is an echo of an existing back-end API, projecting internals rather than tailoring a simpler public API to them. Perhaps it's an interface optimized for some internal legacy system rather than the external consumers of it. Or perhaps it already is a lot simpler than what's behind it, and its interface has been carefully considered (even if it doesn't feel that way), who knows. To be fair, no API is perfect, and this particular API does provide a working and reliable interface to some pretty complex data, and an immense amount of power. It's good to work with lots of different APIs, from easy to painful, as it helps to clarify paths to take (and to avoid) when authoring new APIs.

I ended up building an adaptor module in ProcessWire just to give this particular API a simpler interface that was more useful to the needs we had, and that is now saving us a lot of time. It reminded me of one reason why ProcessWire was built in the first place: to create a simple interface to things that are not-so-simple behind the scenes, and I think we've been pretty successful with that. We'll keep doing that as ProcessWire continues to mature, evolve and grow, as we always have.

In terms of core updates, commits this week were similar to those from the last couple of weeks: a combination of issue fixes, a PR, feature requests and minor improvements.
We are now 17 commits past 3.0.211, but I'm going to wait till next week before bumping the version to 3.0.212, as there's a little more I'd like to add first. Thanks for reading this update and I hope that you have a great weekend!
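The adaptor-module idea described above might look something like this (a sketch only; the class, method and field names are all hypothetical, not the actual module):

```php
<?php namespace ProcessWire;

/**
 * Sketch of an adaptor module: wrap a verbose third-party API behind
 * a small interface shaped to what the site's templates actually need.
 */
class ExampleApiAdaptor extends WireData implements Module {

    public static function getModuleInfo() {
        return [ 'title' => 'Example API Adaptor', 'version' => 1 ];
    }

    /**
     * Return availabilities as simple arrays templates can loop over,
     * keeping all the payload-reshaping logic in one place.
     */
    public function availabilities($itemId) {
        // apiRequest() is a hypothetical helper that performs the
        // authenticated HTTP call and decodes the JSON response.
        $raw = $this->apiRequest("items/$itemId/availabilities/");
        $out = [];
        foreach($raw['availabilities'] ?? [] as $a) {
            $out[] = [
                'start'    => $a['start_at'] ?? '',
                'capacity' => $a['capacity'] ?? 0,
            ];
        }
        return $out;
    }
}
```

The payoff of this pattern is that templates only ever talk to the small adaptor interface, so if the upstream API changes its shape, only the adaptor needs updating.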
    1 point
  5. Thanks @bernhard - that's a simple way of doing it. Could have sworn I saw a module for organising pages in folders.
    1 point
  6. Aren't your prepend/append files listed within the template settings (Files tab)? That would be the easiest way.
    1 point
  7. Just add this to the top of your _prepend.php and _append.php (and insert your page ID 😉):

<?php if($page->id == YOURLANDINGPAGEID) return; ?>

Of course, this also works with a template check:

<?php if($page->template == "yourtemplatename") return; ?>
    1 point
  8. I wonder what is currently the best way to ship a module that defaults to English with, let's say, German translations? I think the translation files have to be uploaded to the relevant language page (eg German) and are then stored in /site/assets/files/my-german-page-id/... What if I want to ship a module with translation files included? That's currently not possible, is it? I wonder if the PW core should be modified to not only look for translations in /site/assets/files/german-id/my-module-translation-file.json but also in /site/modules/my-module/translations/german/my-module.module.php.json What do you think? Am I missing anything?
    1 point
  9. This feature was added recently. Docs can be found here: https://processwire.com/blog/posts/pw-3.0.181-hello/
    1 point
  10. If you have a robots.txt, I would use it to specify what directories you want to exclude, not include. In a default ProcessWire installation, you do not need to have a robots.txt at all. It doesn't open up anything to crawlers that isn't public.

You don't need to exclude your admin URL because the admin templates already have a robots meta tag telling them to go away. In fact, you usually wouldn't want to have your admin URL in a robots file because that would be revealing something about your site that you may not want people to know. The information in robots.txt IS public and accessible to all. So use a robots.txt only if you have specific things you need to exclude for one reason or another. And consider whether your security might benefit more from a robots <meta> tag in those places instead.

As for telling crawlers what to include: just use a good link structure. So long as crawlers can traverse it, you are good. A sitemap.xml might help things along too in some cases, but it's not technically necessary. In most cases, I don't think it matters to the big picture. I don't use a sitemap.xml unless a client specifically asks for it. It's never made any difference one way or the other. Though others may have a different experience.
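For illustration, an exclude-only robots.txt along those lines might look like this (the path and domain are placeholders, not recommendations for any specific site):

```
# robots.txt: list only what must be excluded; everything else stays crawlable.
# Remember this file is publicly readable, so never list sensitive paths here.
User-agent: *
Disallow: /some-private-section/

# Optional: point crawlers at a sitemap if you maintain one.
Sitemap: https://example.com/sitemap.xml
```

For pages that should stay out of search results without being advertised in a public file, the per-page alternative mentioned above is a meta tag in the document head: <meta name="robots" content="noindex, nofollow">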
    1 point