Posts posted by MarkE

  1. 3 hours ago, Robin S said:

    hopefully fixed now

    Nope 😓 "This private-user-images.githubusercontent.com page can’t be found"

    But it's fine in GitHub and the modules library 🙂

    • Like 1
  2. 11 hours ago, poljpocket said:

    I am with Bernhard on this: use pages and use them as tags. I love reusability.

    Tags are an option too @heldercervantes - use the Text Tags inputfield with a Page Reference field type and select "allow new pages to be created from field", but I haven't used that with a 'section' hierarchy such as that suggested by @BitPoet.

    EDIT: BTW, the CustomDependSelects module was specifically designed to operate inside repeaters. Whether you use that or the grouping method is a question of how many pages there are and your personal preference.

  3. Not sure if this is of any help, @bernhard, but I had a similar sort of need in my home-built pagebuilder module. The following code is called from the module's init():

    	protected function allowableLayouts($allowedLayoutFieldName) {
    		// Create the options for motif_allowable_children/sublayouts being all the pro-forma layouts
    		$layouts = $this->pages->find("template=MotifLayout");
    		//bd($layouts, '$layouts');
    		if($layouts && $layouts->count() > 0) {// Prevent deletion of options if pages not yet loaded
    			$options = new SelectableOptionArray();
    			$i = 1;
    			foreach($layouts as $layout) {
    				$option = new SelectableOption();
    				$option->set('id', $layout->id); // use page id as this should not change as components are added, changed and deleted
    				$option->set('sort', $i);  // this assigns the correct value but seems to have no effect on the display order
    				$option->set('title', $layout->title);
    				$option->set('value', $layout->path);
    				//bd($option, 'allowable child');
    				$options->add($option);
    				$i++;
    			}
    
    			$allowedLayoutField = $this->fields->get($allowedLayoutFieldName);
    			if($allowedLayoutField && $allowedLayoutField->id > 0) {
    
    				//bd($options, 'Allowable options for ' . $layoutField);
    				// ToDo - Make this conditional on there being a change (better efficiency) after being satisfied that it works OK
    				$allowedLayoutField->type->setOptions($allowedLayoutField, $options);
    			}
    
    		}
    	}

    The above is presented "as-is" but, if it is relevant, I'm sure you can adapt it as required.
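
    For context, a minimal sketch of how a method like this might be wired up from the module's init() - the field name below is just a placeholder, not necessarily what my module uses:

    public function init() {
        // Refresh the selectable options on every load so the options field
        // always reflects the current set of MotifLayout pages
        $this->allowableLayouts('motif_allowable_sublayouts'); // placeholder field name
    }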

  4. 1 hour ago, szabesz said:

    Could you please somehow share higher-resolution videos than those above?

    Sure. Part 1 is here: https://1drv.ms/v/s!AmS0-Sk4lDz9h5tDqJwj50qq6-KEhg

    and part 2 here: https://1drv.ms/v/s!AmS0-Sk4lDz9h5tS2DJsT94F92fJmw

    If you want to play with it over the holidays 🤔, the best thing to do is to create a clean PW installation - ddev is great for doing this quickly - and I would recommend adding ProcessDatabaseBackups and TracyDebugger. Start by following similar steps to the video.

    Note that, in the video, I have 'simulated' the migration installation by taking a backup before adding new fields etc. Take another backup after creating the migration, then restore the original one, changing the database name in the module config.

    And don't forget to RTM: https://metatunes.github.io/DbMigrate/help.html

    • Like 2
  5. Season's Greetings ProcessWirers! I hope you enjoy the gift of this module, but use with care...

    TLDR: This module captures changes made in the development environment so that they can be easily migrated to the live environment without needing to specify the changes or write any code. The demo below gives a brief overview.

    Want to read? Read on.

    One of the (few) problems with ProcessWire, in my opinion, is the lack of any native way of handling migrations. Given that PW is such a powerful tool, capable of supporting sophisticated and complex web-based applications, this is less than ideal. There is a solution, however, in RockMigrations, which accomplishes a lot in a controllable way, provided you are happy to specify your database set-up in code rather than via the UI (albeit that the latest versions allow you to grab much of the required code from the UI). If that suits your needs, great. Around the same time as the first versions of RockMigrations, I started developing my own UI-based migrations module, which I have been using with reasonable success for some time. I halted development of the module for a while as RockMigrations developed and I considered switching to that route.

    However, I decided that my module suited me better and that a real improvement could be made if it was effectively automated so that I no longer needed to specify a migration. So that is exactly what it does: after configuring the module, you add a new migration page with ‘log changes’ enabled (which includes determining what types of objects are relevant for the migration) and work on your development system. Once you have made the desired changes (and tested them!) in the development environment, you go back to the migration page where it has magically captured the objects which have changed and listed them in dependency order. You then ‘export’ the changes, which creates json files to be uploaded to the live environment (via Git or FTP etc.), where they are then ‘installed’ to re-create the changes in the live system.

    The demos below illustrate this briefly: this first one shows the creation of a migration; the installation demo will be in the next post, because of size constraints. See post 4 for the HD videos.

     

    There is a very extensive manual which covers all the features of the module, not just this ‘automatic’ method.

    Available on github at https://github.com/MetaTunes/ProcessDbMigrate and in the modules library here.

    PLEASE NOTE that this is still in 'alpha'. Do not use in production without fully testing and backing up at every stage. It is quite complex so, although I have tried hard to eliminate bugs, there will inevitably be some left!

    • Like 17
    • Thanks 4
  6. 1 hour ago, bernhard said:

    I'd be even more happy if @MarkE had a look at RockMigrations and used it as common base so that we can work together on an api that does all the heavy lifting - independent from whether it is used by RockMigrations or MarkE's GUI.

    I agree that would be nice. I’m still hoping to release my module before Christmas but you know what it’s like - tracking down bugs. For example, I’ve been adding Page Table field compatibility as an alternative to RepeaterMatrix, which throws up a few issues, especially if the table is hosted under a “foreign” parent. I’m just about there now…

    To your specific point @bernhard, it had already occurred to me that there might be a way of combining the best of both modules - e.g. GUI in the development environment with deployment via code; or maybe creating a RockMigrations migration from my json files. However, I don’t even want to think about this until I have released my module and had feedback from other users. 

  7. 6 hours ago, Pete said:

    All I'm thinking is otherwise you need to enter the arrival and departure time *every time* rather than have it set at one of the "parent levels" like cottage/apartment or globally.

    With my system, the times are filled in automatically from the default for the property or system, but can be changed if, for example, early access is granted. Since the system also generates email correspondence with the guests, it needs to tell them the arrival and departure times. Sure, it could be done with separate fields, as it is now. Having thought about it a bit more, I can see that there might be some complications arising from including times in the range, and you would certainly want the time to be hidden in some circumstances.
    I have another app with event booking, where date and time are essential. Using the current interface, similar to the one in my example, is a bit fiddly.

    • Like 1
  8. You'll have to write some code for that @ShadowByte.

    This is my code for my blog tags (in Latte), which I'm sure you can adapt for your purposes (my blogs are in a repeater matrix field here):

    {var $blogsPage = $page->motif_display_pageref}
    {var $blogs = $blogsPage->find("template=MotifDisplay, parent=$blogsPage")}
    {var $tags = []}
    
    {foreach $blogs as $blog}
        {foreach $blog->motif_layout_components as $component}
            {if $component->type == 'blog-post'}
                {var $blogTags = $component->motif_blog_tag->explode('title')}
    {*            {bd($blogTags, 'blogtags')}*}
                {foreach $blogTags as $blogTag}
                    {if !array_key_exists($blogTag, $tags)}
                        {do $tags[$blogTag]['name'] = $blogTag}
                        {do $tags[$blogTag]['count'] = 0}
                    {/if}
                    {do $tags[$blogTag]['count'] += 1}
                {/foreach}
            {/if}
        {/foreach}
    {/foreach}
    
    {* Sort the tags in descending order of frequency so that the most used are at the top *}
    {var $tagCount = []}
    {foreach $tags as $key => $tag }
        {do $tagCount[$key] = $tag['count']}
    {/foreach}
    {do array_multisort($tagCount, SORT_DESC, $tags)}
    
    {*<section>*}
        <ul n:inner-foreach="$tags as $tagKey => $arr">
            <li>
                <a href="{$blogsPage->url . 'tag/' . $tagKey}">{$tagKey}</a>
            </li>
        </ul>
    {*</section>*}

     

    • Like 2
  9. 11 hours ago, bernhard said:
    13 hours ago, Rasso said:

    *automatically* generate and save migration files every time I create / update any field or template locally

    That's not how RockMigrations works. At least not at the moment. And that has several good reasons. I'm not saying that such an approach is bad. But it is very complicated to implement, it comes with a lot of (maybe unsolvable) problems and even if everything worked it has a lot of limitations by design.

    I didn't want that for RockMigrations, so I built it differently.

    Quite correct, except "even if everything worked it has a lot of limitations by design". I'm not sure what those might be. ProcessDbMigrate is working pretty well now and I will let people know when I am happy with the testing (and documentation!). Yes, it was very complicated to implement but it is now coming together nicely. I think RockMigrations is great for people who want to work that way, but I chose a different path and will happily accept criticism (and, even better, constructive suggestions) if it doesn't work as intended.

    11 hours ago, bernhard said:

    Having a gui that stores a snapshot likely means that you get a bloated json or yaml with a trillion of settings that you don't need and that you might never understand, because you didn't take the time to inspect the whole json.

    That is a risk, but I don't find it a problem with my module. It deliberately omits some unnecessary properties.

    11 hours ago, bernhard said:

    And if you have a look at line 71 you see that I'm using the Inputfield::collapsedHidden constant instead of the integer value 4 which you would get from a gui-based JSON/YAML migration file.

    Yeah - that's nice. I wonder....

    11 hours ago, bernhard said:

    Another huge drawback of a gui based migrations tool with a central place of migrations is that you limit yourself to the project. Everything you do you do for the project. With RockMigrations you can split migrations into reusable components and place migrations where they logically belong.

    ProcessDbMigrate uses multiple migrations and you can define one to export the database to another project. I am using it to install another module in new projects and plan to use it to create a template project from a live one. I'll admit that it is early days yet and that it is probably not as flexible as RockMigrations, but I'm not convinced it is a 'huge drawback'.

    11 hours ago, bernhard said:

    Imagine you have built that blog for your project with GUI based migrations... You have another project request with a blog? Have fun, do everything again. Of course that comparison is exaggerated, but you get the point.

    With ProcessDbMigrate, you use a Page class module in the same way (without all the field definitions etc.), create a very simple 'export' migration of the required fields and templates manually (just specify the names in repeaters), sync to your target and hit 'import'.

    3 hours ago, Rasso said:

    but I think it would be enough to describe the mechanics of your module without denigrating other approaches... otherwise you're doing the same thing you've accused others of in this post: criticizing a tool that you haven't used extensively yourself

    😉 Agreed, but I don't blame @bernhard for that. I would welcome his comments when my module is released.

    3 hours ago, Rasso said:

    The json/PHP files are not meant to be touched manually anyways, so no need to understand them – although they are easy enough to understand with some experience.

    True in my case too, as the module interprets them and presents differences etc., but they have been useful in debugging 🤪

    28 minutes ago, da² said:

    Like @bernhard said, this approach has limitations and drawbacks, at least:

    • You must be sure that both databases use the same IDs for fields and templates (and maybe more, like languages...),
    • Nobody can manually change fields and templates on the target installation,
      • For both points above we may fall back on field and template names, except when a field/template has been renamed.
    • Some data cannot be updated so easily, like the field type, which can require changes to the database structure,
    • Maybe some data in the "data" column (of the "fields" and "templates" database tables) cannot be changed so easily either (I see for example some "parent_id" properties),
    • Probably more drawbacks... 🙂

    • ProcessDbMigrate uses names and paths, not ids. If a name is changed, it tracks that, so that it updates the target correctly. So ids do not need to be the same.
    • You cannot change fields etc. in the target if they are the subject of a live migration (unless you disable that feature), but you can after it has been fully installed and 'locked'. Of course, if someone updates the live site and someone else then creates a new migration from an old database version that overwrites it, that is a problem!
    • So far I have not had a problem with any type of change that you can make in the UI (including parent id and template id references - these are converted to names and then converted back on installation).

    Sorry if the above sounds a bit defensive, but I just wanted to set the record straight. Of course, you can disagree once the module is released and you have used it!

    Just to emphasise a couple of points:

    1. RockMigrations is a great free module and if it's what you want then use it.
    2. ProcessDbMigrate does what I want and I thought it would be helpful to share it (but only when I am happy that it will not cause any difficulties). I have no desire ever to sell it or to position it as some sort of 'competitor', but I would be grateful if others do not knock it without just cause.
    • Like 3
    • Thanks 1
  10. 1 hour ago, ShadowByte said:

    Now edited a news item and entered the tag “Test” and saved it.
    The "Test" tag was not saved.

    There is also no setting that allows users to create their own tags, and I cannot specify tags.

    I am assuming you selected 'text tags' as the inputfield type on the field 'input' tab and also selected a parent and template for the tags. Further down that tab, you need to select 'Allow new pages to be created from field' - please note the requirements listed there.
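
    If it helps, here is a rough API equivalent of those settings - a sketch only, with placeholder field/template names, so adjust to suit (you may well prefer to do it all in the admin UI):

    // Hypothetical 'tags' field configured via the API rather than the UI
    $field = new Field();
    $field->type = wire('fieldtypes')->get('FieldtypePage');
    $field->name = 'tags';
    $field->label = 'Tags';
    $field->inputfield = 'InputfieldTextTags';                 // 'text tags' on the Input tab
    $field->parent_id = wire('pages')->get('/tags/')->id;      // parent for new tag pages (placeholder path)
    $field->template_id = wire('templates')->get('tag')->id;   // template for new tag pages (placeholder name)
    $field->addable = 1;                                       // 'Allow new pages to be created from field'
    $field->save();
    // Remember to add the field to the relevant template's fieldgroup as well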

  11. 6 hours ago, ShadowByte said:

    What I haven't understood yet is how do I get all the tags that have ever been entered? For example, to display a tag cloud like this?

    Use a Page Reference field for your tags. Then you can do pretty much what you will.
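
    For a tag cloud, something along these lines should work - a rough sketch where 'tag' and 'post' are placeholder template names and the Page Reference field is assumed to be called 'tags':

    // Count how often each tag page is referenced and output a simple cloud
    $tags = wire('pages')->find("template=tag, sort=title");
    foreach($tags as $tag) {
        $count = wire('pages')->count("template=post, tags=$tag");
        if($count) echo "<a href='{$tag->url}'>{$tag->title} ({$count})</a> ";
    }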

  12. On 11/27/2023 at 12:14 PM, d'Hinnisdaël said:

    being JSON-based should be a good enough base

    One thing I should mention (and would appreciate some feedback on) is that my migrations module (and I guess any that uses json files) is declarative. Even if you create a migration automatically while making your database changes, what happens is the module logs what objects have changed and exports the final state. It does not operate as a 'macro' logging each change separately. The disadvantage of this is that problems might arise where there is a 'dependency cycle' - e.g. a new field (say a page ref type) depends on a new template which includes the field. With my module, if this happens while you are logging changes it will (should 😉) warn you. If you create the migration manually and then 'sort on save' to get the items in dependency order, it will give you an error message (see video at end).

    I think there are 3 ways of dealing with this:

    1. Do more than one migration, so that all required objects are present in the database.
    2. Include an item twice in the migration - firstly without the dependency, then with the dependency after the other object has been added.
    3. Install the migration twice and hope it will sort itself out (i.e. the missing objects will be there on the second installation).

    My module encourages (1) but permits (3) - as you will see. It does not attempt (2) - this might be feasible by detecting the cycle and breaking it for the first item, but it would be a bit complex to implement.
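
    To make the cycle concrete, here is a rough illustration in plain API terms (not my module's code - the names are invented for the example):

    // A new page-reference field wants to limit selectable pages to template 'event'...
    $field = new Field();
    $field->type = wire('fieldtypes')->get('FieldtypePage');
    $field->name = 'related_event';
    $eventTemplate = wire('templates')->get('event');
    $field->template_id = $eventTemplate ? $eventTemplate->id : 0; // 'event' may not exist yet - the cycle
    $field->save();

    // ...but template 'event' is itself new and includes 'related_event' in its fieldgroup,
    // so whichever object is created first refers to something that does not yet exist.
    $fg = new Fieldgroup();
    $fg->name = 'event';
    $fg->add(wire('fields')->get('title'));
    $fg->add(wire('fields')->get('related_event'));
    $fg->save();

    $template = new Template();
    $template->name = 'event';
    $template->fieldgroup = $fg;
    $template->save();

    // Option (2) above breaks the cycle: create 'related_event' first without template_id,
    // then set template_id in a second pass once 'event' exists.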

    I'd be interested in any thoughts people have on this issue (and indeed on how RockMigrations deals with it - @bernhard?).

    Meanwhile, to amuse you (??) here is a video of a big complex migration I did as part of the testing process. As you will see, there are lots of cycles, but it installed with just two clicks of the 'install' button 🙂

     

     

    • Like 2