Posts posted by poljpocket

  1. @Gideon So I beg to differ on that assumption. Pageimages is a WireArray and thus will have a sort() function. See here: Pageimages class - ProcessWire API.

    @biber You can find the API for sort() here: WireArray::sort() method - ProcessWire API. There is no rsort() function, but sort() can still reverse the order like so:

    $images = $page->images->sort("-iname_".$order, SORT_NATURAL |SORT_FLAG_CASE);

    Note the "-" (minus) in front of the field name.

    Here is another approach:

    $images = $page->images->sort("iname_".$order, SORT_NATURAL |SORT_FLAG_CASE)->reverse();

    This is using the reverse() function of WireArray, whose docs you can find here: WireArray::reverse() method - ProcessWire API

    • Like 1
  2. Hi all.

    I just ran into a very strange problem. The solution is below but I am just not sure if this is a 'feature' or a bug 😀!

    My lister bookmark for selector:

    'has_parent!=2, template=62, categories=1072, limit=25, sort=-modified, include=unpublished'

    gets changed into the selector below whenever I am not logged in as a superuser:

    'has_parent!=2, template=62, categories=1072, limit=25, sort=-modified, include=hidden'

    Note the changed 'include=' part. EDIT: The template with ID 62 is 'product' in this case; below I refer to it by name instead of ID.

    From reading the source code, I can follow this exactly here:

    // if all specified templates are editable, include=unpublished is allowed
    if($numEditable == count($templates)) {
        // include=unpublished is allowed
    } else if($includeMode == 'unpublished') {
        // include=unpublished is not allowed
        if($showIncludeWarnings) {
            $this->resultNotes[] = $this->_("Not all specified templates are editable. Only 'include=hidden' is allowed");
        }
        $includeSelector->value = 'hidden';
        $changed = true;
    }

    This is taken from ProcessPageLister here. The way the $numEditable variable gets determined is as follows:

    foreach($templates as $template) {
        $test = $pages->newPage($template);
        $test->id = 999; // required (any ID number works)
        if($test->editable()) $numEditable++;
    }

    You can find this here, a few lines above the first snippet. The selector doesn't contain any parents, so these lines are the ones being used.

    Whenever I am logged in as a non-superuser, I can also see the error message "Not all specified templates are editable. Only 'include=hidden' is allowed". So everything behaves just as intended.

    Now for the solution, which brings me to this possible bug:

    Adding access control to the template 'product' and enabling editing for the specific role solves the problem. The lister now shows unpublished pages and the error message is gone. So far so good. But: Before the change, the user could still edit the products because access was inherited automatically from the home template. Hence on the surface, the configuration is exactly the same and the user can do exactly what they could before. Only the direct access has changed which has an impact on the 'editable' check.

    The question now is, is this intended or am I looking at an edge case which isn't covered by the rather simple editability check using an arbitrary new page? @ryan: Could it be that Lister should add a parent to enable access inheritance in this check? I can imagine that this gets very complicated rather quickly...
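
    To illustrate the idea (this is a sketch only, not the actual core code): if the throwaway test page were given a real parent before the editable() check, access inheritance could be taken into account. The parent path '/products/' is a hypothetical example.

    foreach($templates as $template) {
        $test = $pages->newPage($template);
        $test->id = 999; // required (any ID number works)
        $test->parent = $pages->get('/products/'); // hypothetical: let access inherit from this branch
        if($test->editable()) $numEditable++;
    }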

  3. No, not at all, just for local development. EDIT: "not at all" = no, we don't use Docker for deployments.

    Main reason: Most of our clients won't pay for the increased cost of a cloud platform.

    Second important reason: Most of our projects in the end are "just websites" and for that, the bells and whistles a shared hosting environment provides on the cheap are just too good to not use (e.g. email sending without a complicated configuration, managed configuration and automatic updates, and HTTPS support with auto certificate renewal).

    Also, you can look at some of the other Docker topics I have answered on the forum; PW doesn't play too well with cloud environments (e.g. problems with session retention due to the proxying cloud environments use). I could be wrong, but I haven't dug into it more than I needed to in order to have a clean and simple local Docker environment.

    • Like 2
  4. Hi Bernhard, lots of questions 🙂 I am happy to answer!

    44 minutes ago, bernhard said:

    How do you execute these snippets?

    Right now, we are using my module and in the not so distant future, probably the Migrations module.

    44 minutes ago, bernhard said:

    Who does it?

    Always me, the team lead. We are planning to use GitHub Actions soon though.

    46 minutes ago, bernhard said:

    Did you think of any different approaches? Why do you think this has not been an issue for me so far?

    I think this workflow makes sense and I can't come up with a different approach. Sure, most of the migrations probably work just fine if executed several times and for those, we could use vanilla RockMigrations. I can even give you two good examples. Note that I can't share code since these are both commercial projects:

    Featured images
    Right now, we have an ongoing project where, as a first phase, we finished the editing experience for a product list (not a shop, but with many of the same features). We included a product gallery, and for the longest time it was fine to just use the first uploaded image as the featured image. Then we ran into problems with images whose unfortunate subject placement made the first image, with the same settings, incompatible with the featured-image concept as it was. This forced us to create a new image field where we can set a different focus point and sizing constraints. Since the client had already entered 200+ products, we simply cannot ask them to upload the first image (or possibly a new one) again for every product. The snippet in question, which can only be run exactly once, does the following:

    1. "duplicate" the first image on disk,
    2. resize to match new constraints,
    3. set it as the featured_image,
    4. copy over focus point settings (which is fine for most, but not all).

    Everything else (create field, add to template, ...) is done with RockMigrations.

    Why can I run this only exactly once? From this point on, the client will start uploading possibly different images to featured_image and changing some of the settings where the original ones don't work out. If we ran this procedure a second time, all that work would be gone and the filesystem/database would possibly end up in a corrupted state.
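
    For illustration, a rough sketch of what such a run-once snippet could look like. The template and field names (product, images, featured_image) are assumptions, not the actual project code:

    <?php namespace ProcessWire;

    // Hypothetical sketch only; template and field names are assumptions.
    foreach($pages->find('template=product, images.count>0') as $product) {
        $product->of(false); // turn off output formatting before changing values
        $first = $product->images->first();
        // "duplicate" the first image by adding its file to the new field;
        // ProcessWire copies the file into the page's files directory
        $product->featured_image->add($first->filename);
        // resizing and focus-point adjustments would follow here
        $product->save('featured_image');
    }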

    Combining multiple fields into one with specific markup
    For another project, we had some references set up with a very sophisticated mask for inserting info like the client, the year and some more meta information. A spec change forced us to combine all of this information into a single rich-text editor to allow for more flexibility and customization. Again, much of the content management had already been done at this point and we couldn't just ask for it to be done again the "new way". The snippet, which can only be run exactly once, does the following:

    1. render the fields together as markup and set to the body field
    2. remove all fields not needed anymore (with RockMigrations)

    Again, everything else (add to template, configure field in context, ...) is done with RockMigrations.

    Why can I run this only exactly once? This one is a bit less clear, as this migration could be gated with a simple if (e.g. are the fields still on the template or not). But still, it is easier to just make sure it doesn't run more than once.
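
    To illustrate the gating idea, a minimal sketch; the template and field names (reference, client, year, body) and the markup are hypothetical:

    <?php namespace ProcessWire;

    // Hypothetical sketch only; gate the migration so it is skipped once
    // the old fields have already been merged into the body field.
    foreach($pages->find('template=reference') as $ref) {
        if(!$ref->client && !$ref->year) continue; // already migrated
        $ref->of(false);
        $ref->body = "<p><strong>{$ref->client}</strong>, {$ref->year}</p>" . $ref->body;
        $ref->save('body');
    }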

    • Like 2
  5. By the way, I had a private conversation with @elabx in which we discussed the Migrations module a bit more. Since it said "deprecated in favor of RockMigrations", I originally didn't look into it further. This was a mistake! It basically does exactly what my module does. Just better. And there is absolutely no reason not to combine it with RockMigrations!

    I have forked that module and put in some early work to update it to namespaces and PHP 7/8 compatibility. I might talk to @LostKobrakai about maybe taking over or providing some PRs to keep it alive, since there is a commercial incentive now.

    https://github.com/poljpocket/Migrations/tree/pw3-php8 (note that the work is just on a separate branch for now).

    • Like 4
  6. Thanks Bernhard for your post!

    55 minutes ago, bernhard said:

    That was confusing for me because it's wrong 😉

    I just typed that out here, it isn't something I copied from somewhere. And it must have been late after a long day at work 😛 Of course this isn't right and I have fixed it above!

    I love separation of concerns. I view your RockMigrations as an API wrapper which lets me control templates and fields in a very cool way, and you have support for RepeaterMatrix, which is something we use quite often. Also because of this, I am - just a tad - bothered that you included MagicPages and some "random" QOL features in RockMigrations, which cover a bunch of completely different concerns. I would offer MagicPages as an optional extension to RM and move all the QOL stuff into a new module. But in the end, I am glad you put in all the work for RM, so I'll stop bashing you now 😄! RM is tailored to your workflow, which is completely fine.

    In order for you to understand where all these ideas come from, I need to explain our normal project flow first:

    Usually, one person in my team starts the project and will set up a local installation of PW using this repo (you pointed that out already) and a site profile like this one. We can move this repo around in our team and work on it collaboratively. The setup will use a small set of "test data" which rarely changes and rarely reflects the data that will make up the end result. These will largely be placeholder media and lorem ipsum texts.

    Sometime during the early to mid-maturity of the project, our content people want/need to start working on the contents and our designers will want to insert all the images and media. The larger the project, the earlier this will happen. This is the point where we start to have two installations: one local and one on our dev server. From this point on, we cannot touch the pages and their contents anymore and right here, migrations are the way we need to "upgrade" the dev server to keep it in line with our local version. This is also why we need these small snippets which can only be executed exactly once.

    Since the project is rarely feature-complete at this point, my example above is quite typical: the template uses a textarea field, but due to a late spec change we have to change it to a TinyMCE field somewhere down the road. Since there could be many pages holding final content already, we need to use the snippet to move the data around. Also, for debugging issues on the dev server, we might copy some of the final content down to the local environment by hand to have some more context.

    From there, in the very late stages of the project, the site goes live and might need some more touch-ups before the launch. This is usually the point where a migration might make its way from Docker to the dev server and then to the live site.

    As you can see, we might have three teams wanting to push the project forward at the same time. This calls for exactly what my module is about.

  7. Thank you @elabx and @MarkE for your hints! I had looked at both of these modules before we finally decided to go with RockMigrations, which, for our workflow, works just as we need it. We like to create new templates, fields and their interconnections directly in code and then let that flow from the local Docker environment to the dev server and from there on to the live site.

    Sadly, I never considered running Migrations alongside RockMigrations; I must give that one another try sometime. Also, since it explicitly states that it is deprecated in favor of RockMigrations, we didn't look into it much at all.

    The reason we didn't go with DbMigrate was that, although the feature set (including even change tracking) is very strong and impressive, it doesn't match our workflow at all, mostly because we want to primarily work with files directly to build our structures, even in the local environments.

    All of that gave rise to the idea that we just needed this small extension to manage the revisions through either the admin or, as a second step, CI via GitHub.

    But thanks to you, my biggest question is answered: there might not be any need on a broader scale, since there are already some strong modules here. Of course, I don't want other people to have to pick their poison between competing modules that do the same thing.

  8. Hi all. I have decided to publish one of the plugins I have been working on in my free time during the last few days, but which was actually motivated by our needs at work. It is currently in early beta and much of its functionality might be subject to change. Before you jump at my throat and tell me that I am copying RockMigrations by @bernhard, please read on!

    System Config Versions

    This module adds an admin interface for developers to manage configuration revisions. Think of it as git for PW configurations 😊.

    It ensures that revisions only get run once. It is possible to run multiple revisions up to a certain point or even all available ones.

    [Screenshot: revisions list]

    Aren't you just copying RockMigrations?

    No. I made this plugin because at work we needed more precise control over migrations, and we also do a lot more in migrations than adding and removing structural configuration. Oftentimes, we use small snippets to change field data throughout the process of creating websites, and these may only be run exactly once.

    Actually, we usually use RockMigrations alongside this plugin to make use of its very impressive function set! Think of this plugin as a way to persist the run status of any migrations and ensure any migration only ever gets run once.

    Simple Case Study

    The motivation behind the module is best described with a code example:

    <?php namespace ProcessWire;
    
    /** @var RockMigrations $rm */
    $rm = $modules->get('RockMigrations');
    
    $rm->createField('richtext', [
        'type' => 'FieldtypeTinyMCE',
        // ...
    ]);
    
    $rm->addFieldToTemplate('richtext', 'basic-page', 'textarea');
    
    $pagesToUpdate = $pages->find('template=basic-page');
    foreach ($pagesToUpdate as $pg) {
        $pg->setOutputFormatting(false);
        $pg->set('richtext', $pg->textarea);
        $pg->save();
    }
    
    $rm->removeFieldFromTemplate('textarea', 'basic-page');

    The goal of this revision file is to convert an existing textarea field on all `basic-page` pages to a richtext field. The code snippet which maps the contents of the fields is obviously part of the revision, and with bare RockMigrations we have no control over how many times it gets run. With SystemConfigVersions, the snippet gets run exactly once.

    More Info & What's next?

    For usage instructions and details, refer to the repo's README.

    Do you guys have any first impressions or any recommendations or ideas for this project? Is there even a need for it in a broader sense?

    You can find the GitHub repo here: https://github.com/poljpocket/SystemConfigVersions

    Some ToDo's as of now:

    • URGENT: Greatly improve error handling
    • Add GitHub Actions support to automatically apply all new versions
    • Add ability to reset run status down to a certain point
    • Revise documentation below
    • Add configurability for versions folder location and file naming scheme
    • Add folder as version support (e.g. all files inside a version folder are executed and treated as one revision).
    • Like 6
  9. What was the version you updated from? We could use this to look for anything that has changed since then.

    If this is a production site, I would advise against using the dev branch. The latest master version is 229, which should be stable for production.

  10. Hi Webjack 🙂 Interesting project you have here!

    Some more pointers:

    • As other people have already pointed out, you can use the API to create new pages (let's call them subscriber) with a specific parent (let's call that subscribers) which will hold your email addresses. Don't forget to look at EU law and whatever it entails once you go to production with this. Here, FrontendForms or FormBuilder are good starting points, but also have a look at LoginRegisterPro. FormBuilder and LoginRegisterPro are paid modules, but you get a lot of functionality in either case. Both paid modules can also handle saving new users to the database automatically, just like FrontendForms can. I always use custom forms and submission handlers, but then you need to code a lot more.
    • Further, say you want to notify any subscriber when a new article (let's call them post) has been published. You need a hook like Pages::saved(template=post) to check whether a new post has been saved and then send out your notification email to all subscribers (see the sketch after this list). A hint on how to decide if a page is new can be found here.
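
    A rough sketch of both points; the template names (subscriber, post), the parent path and the use of the title field for the email address are assumptions. I am using the Pages::published hook here instead of Pages::saved, since it fires when a page gets published:

    <?php namespace ProcessWire;

    // Creating a subscriber page via the API, e.g. in a form submission handler.
    $subscriber = new Page();
    $subscriber->template = 'subscriber';
    $subscriber->parent = $pages->get('/subscribers/');
    $subscriber->title = $sanitizer->email($input->post('email'));
    $subscriber->save();

    // Notifying all subscribers when a post gets published, e.g. in site/ready.php.
    $wire->addHookAfter('Pages::published(template=post)', function(HookEvent $event) {
        $post = $event->arguments(0);
        foreach($event->wire()->pages->find('template=subscriber') as $subscriber) {
            wireMail()
                ->to($subscriber->title)
                ->subject("New post: {$post->title}")
                ->body("Read it here: {$post->httpUrl}")
                ->send();
        }
    });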
  11. 46 minutes ago, bernhard said:

    Why is creating pages any worse than populating a repeater? I'd prefer pages I guess.

    Which, in turn, creates pages too, albeit invisible to the user!

    I am with Bernhard on this: use pages and use them as tags. I love reusability.

    Also, considering you have multiple BnB locations, won't a repeater approach just make the user enter the same items again and again (exactly the reason to go with tags instead of repeaters)?

    Another reason for tags: they ensure consistency and avoid typos or items that mean the same thing but are written slightly differently. And, most importantly, you can easily filter by amenity using the API, as shown below.
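
    For example (a minimal sketch, assuming a template 'bnb', a page reference field 'amenities' and an amenity page titled 'Pool'):

    $pool = $pages->get('template=amenity, title=Pool');
    $locationsWithPool = $pages->find("template=bnb, amenities=$pool");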

  12. If you are into Docker, you can use my template to create your own local installation (although intended for development). It maps the site directory outside of the container for development. But you should easily be able to just upload your zip to the container and extract it over whatever is there already.

    For the DB, you can put the dump into the data folder and name it 'database.sql' and then run the 'dbrestore' script (for this, the composition must be running).

    poljpocket/processwire-docker: Docker installation of ProcessWire for local development (github.com)

    • Like 2