Leaderboard
Popular Content
Showing content with the highest reputation on 06/29/2021 in all areas
-
A companion module for RockMigrations to gather and execute migrations in one place. !Alpha RELEASE!

Hi there, I just wanted to share my take on migrations and how I am using RockMigrations to make my deployments manageable. First of all, a big shoutout to @bernhard for his efforts and accomplishments for our beloved CMS. We recently had a video call and talked about our daily problems and how to handle them, with a focus on migrations. By now you might think "Gosh, not another migrations thingy", but wait: wouldn't it be nice to have an overview or some kind of list of your migrations, and one place to manage them? Speaking for myself, yes, I would like to have something like this. So here we are with a companion module for RockMigrations.

Product Requirements
I want to manage migrations on the application level (done)
I want to have one place in my file system to keep my migrations (done)
I want to see which state my migrations are in (done)
I want to have an execution order and execute migrations one by one (done)
I want to trigger migrations via CLI (open)
I want to group multiple migrations into one migration to create build versions (open)
I want to roll back migrations (open)
I want to create a deliverable build out of migrations (open)
I want to track changes to templates and fields and create migrations accordingly (open)
I want to manage migrations on the module level (open)

Module Requirements
ProcessWire 3.0.178
RockMigrations latest
PHP 7.4
Composer

Current release: v1.0.2 Alpha

How does it work?
On installation the module creates a new database table which holds information about migration states. Your list of migrations is empty on first installation, so just create your first one. You can then choose which type of migration you want to create, be it Templates, Fields or Pages, and a corresponding action. I'm still working on this function, so it is a little confusing right now. The philosophy here is that every type of migration can be easily identified and we can migrate on a very specific and granular basis. As you can see in my screenshot, I am using migrations to build an entire app.

After creating the new migration file, switch over to your IDE and find a timestamped .php file inside modules/FlowtiCore/migrations. This file just returns a simple array. This is all the info RockMigrations needs to do its thing while we keep track of the execution. Easy. Value arrays have to follow ProcessWire and RockMigrations field naming conventions. To make a migration executable, set 'boilerplate' to false.

After creating your first migrations your overview should be filled, so let's go. We have to migrate our files in a specific order to make sure we can create page references and parent/child connections. To achieve this, migrations are timestamped on creation: the older the migration, the higher its priority. To enforce the execution order, every migration needs to have its predecessor executed and installed.

How could this help my workflow?
My workflow is based on Git, webhooks and Git events. Whenever I merge 'staging' into 'master', a build is created and a deliverable is pushed to a server via SSH. The problem with ProcessWire is the lack of support for such workflows and toolchains due to its user-friendly admin backend, which is fine for a simple website but not suitable long-term when working in a multi-tenant environment or with more developers in a dev-staging-test-production setup.
My goal is to provide methods and ideas to support such workflows while also offering a user-friendly interface to work with migrations. I really hope it can be of use for someone.

Installation
I will add the module to the modules directory once it reaches a stable state, but you can get the current version on GitHub: https://github.com/Luis85/FlowtiCore
Just clone it into your modules directory. The module will create a new database table on installation. The default name is 'flowti_core_migrations'. To change this, just edit
const DATABASE_TABLE = 'flowti_core_migrations';
inside the module class. That's it; from there on just create new migration files, edit them and execute them.3 points
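For orientation, here is a minimal sketch of what such a migration file might return. The keys (type, action, boilerplate, data) and the field definition are assumptions based on the description above, not the module's documented schema; check the FlowtiCore README for the real format.

<?php namespace ProcessWire;
// Hypothetical example file, e.g. modules/FlowtiCore/migrations/1624958100-create-headline-field.php
return [
    'type'        => 'fields',   // Templates, Fields or Pages
    'action'      => 'create',
    'boilerplate' => false,      // per the post, set to false to make the migration executable
    'data'        => [
        'headline' => [
            'type'  => 'text',   // RockMigrations-style field definition
            'label' => 'Headline',
        ],
    ],
];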
-
I've got some core updates in progress but nothing major committed yet, so we'll save that for next week. But I wanted to briefly tell you about a module I'm working on here, which I plan to eventually put in core. I built it for some needs I've had here, but it will first be released in ProDrafts (so ProDrafts may start using its API), and in ProDevTools too, since it is also a developer tool. It's a snapshot versioning system for pages, but more of an API tool at this stage, kind of like the $pages API var, but for snapshots/versions of pages. It lets you create a snapshot of any page, including its fields and files (including core fields, ProFields, 3rd party fields, repeaters, nested repeaters, table fields with a million rows, etc.). The snapshots can be restored at any later time. Snapshots may also be restored to a different page, such as a new or existing page. In this manner, a module like ProDrafts may be able to use it to manage draft versions even better than it currently can, or at least that's the plan. Though since it's an API tool, my hope is that when/if it winds up in the core, others may be able to use it for stuff we've not thought of yet too. The module is a little different from my previous attempts (like what's in ProDrafts now) in that it does most of its work directly at the database level (and file system level, where applicable), which enables it to work without needing to know much about the Fieldtype. Currently it is fully functional, but I have a few less common cases to work out before it's ready for release. Once installed, every page includes a Snapshots fieldset, which can be located in the settings tab of the page editor or in a separate tab (configurable), so long as the user has the appropriate permissions. There's a screenshot of what it looks like in the page editor below, but it is very simple at this stage, basically just enough for testing purposes. Every snapshot has a name that is unique on the page. You can overwrite a snapshot just by saving a snapshot with the same name on the same page. So you could for instance have a hook that creates a snapshot named "undo" right before a page is saved (in a Pages::saveReady hook), and in that way a module could add a basic undo capability to the page editor. This is just a simple example though. Thanks for reading, have a great weekend!2 points
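To make the "undo" example a bit more concrete, here is a rough sketch. Pages::saveReady is a standard ProcessWire hook, but the snapshot call itself is left as a commented placeholder because the module's API has not been published yet.

// In /site/ready.php
$wire->addHookAfter('Pages::saveReady', function(HookEvent $event) {
    $page = $event->arguments(0);
    // Hypothetical: overwrite a snapshot named "undo" for this page just before it is saved,
    // e.g. something like $snapshots->add($page, 'undo'); the actual method names are not yet known.
});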
-
I had your workflow in mind when creating the module, but needed a baseline to get it going. I will update the Product Requirements; I just hadn't thought enough yet about how to merge the different approaches. My initial thought was to let the user define which modules he wants to manage with the module, build an array of modules and folders according to the user configuration, and scan the affected folders. This would result in a structure like
/site/migrations
/site/modules/XYZ/migrations
On a side note, I also want to put template and field changes done via the ProcessWire backend under migration control, just to support an easy "click-and-collect" workflow. For example, you build some templates via the UI to test things out on your local machine, migrations are added automagically, and after a push to your Git repo you have your executable migrations in place.2 points
-
Bernhard, thanks for your detailed and well-thought-out answer, I really appreciate it. I definitely see where you are coming from. We are working on very different topics: you are working on a multitude of projects and need to keep track of your modules and the various combinations and connections between them. I am glad I don't have to deal with that. My use case is a bit different, as I am working on a single installation with a well-defined set of modules and the same functions for every single user/customer. Your approach of dealing with migrations on a module basis makes sense, yes. The thing is, I really don't like to mix business logic, domain-specific logic and application logic, hence my file-based approach to solve my first need. My second need would be to put modules under migration control like you do. My primary thought was to create a /site/migrations/ folder and let the user decide which module he wants to put under migration control and act accordingly, so moving the module-bound migrations folder into the site scope. I don't know yet, but I will definitely support a use case like yours in the future, just because I would need that feature, too. Regarding the granularity of migrations, yes, it could end in a mess. My reasoning for separating the different types of migrations is to force the developer to at least think about the execution order and the implications of his database change. You are forced to discipline yourself, so to say, to avoid a mess. #me #myself and #i2 points
-
PS: Thanks for your interest and efforts in migrations - it's always very welcome to see someone working on solutions.2 points
-
Hi @LuisM, thanks for the great writeup - it's always hard to explain complex topics! To be honest I'm not sure if I like what I see. On the one hand I have sometimes wished I had some kind of overview/list of migrations and some place to manage them, but on the other hand I'm using migrations in a totally different way and your approach feels like a step back in many ways. No offense here - both approaches have pros and cons, and it might simply be a matter of preference. The problems with that approach are the following (for me):

One central place for migrations
That might sound good and it might be superior from a technical standpoint, but for me and my daily work it's not. I'm building my sites with many different modules and my main goal is productivity and quality. Almost anything that I do is done via modules nowadays. These modules often need to create the necessary fields and templates, and they do that using RockMigrations. It's great: I get a module that is reusable across projects, and it's quite easy to develop and to create all the necessary database stuff. It's also easy to push new versions to production, and after a Modules refresh I have everything I need. Having a central place for migrations locks all the work that I do into one single (and not reusable) project. Or am I misunderstanding your concept? So my solution would be: create different modules that do different things (e.g. a newsletter module, a sitemap module, a slider module), and all those modules ship with their own migrations. They are separate pieces and can be used across several projects. Then on the main project I'd have another module (sometimes I simply call it Site.module.php) and that one has migrations to install the other modules or do stuff that does not fit into a reusable module. Such a migration could look like this:

$rm->installModule('Slider');
$rm->installModule('Sitemap');
// create home page
// set page title
// create imprint page
// etc etc

I know that this approach has other drawbacks, but for now it works quite well for me.

Template/Field/Page migrations
I think I also don't like the approach of splitting every migration into a dedicated field/template/page migration. Maybe I'm misunderstanding you again, but usually such migrations are related. E.g. you want to add a field to a template, so you need to create that field and then you need to add that field to your template. Having those changes in separate files leads to a mess of migration files that are very hard to understand afterwards. You lose the big picture imho. My approach is to place all of that in the migrate() method, so you end up with a static "snapshot" of your system that you'll easily and instantly understand when you look at it. That gets even better when you use Git. The drawback is of course that it gets harder to revert such changes. Backup/restore is one technique, or custom downgrade migrations could be another concept. What I sometimes do is add a "cleanup()" method on top of the migrations that removes fields I no longer need. So if I needed to remove that reminder field, I'd do the following (a rough sketch follows below). Again: reverting would be a challenge, but I've never needed any reverts until now and backup/restore has served me well enough. Hope the post was helpful nevertheless.2 points
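The rough sketch referenced above, assuming RockMigrations provides deleteField() and using the "reminder" field mentioned in the post; this is an illustration, not Bernhard's actual code.

<?php namespace ProcessWire;
// e.g. inside a module's migrate routine or /site/ready.php
/** @var RockMigrations $rm */
$rm = wire('modules')->get('RockMigrations');

// cleanup first: drop the field that is no longer needed
$rm->deleteField('reminder');

// ...the regular field/template/page migrations follow here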
-
Please open the dev tools of your browser and check whether there are errors in the console or 404 errors in the network panel. I think the paths might be wrong. In the dev tools network tab you will see where the browser tries to load the assets from. Also, the rel="stylesheet" attribute in your script tag makes no sense; please remove it.2 points
-
Technically, files on any page are available to link to from any other page. You use the "Select Page" field to choose the page that holds the file you want to link to, and then select a file from that page. But to make it quicker to select files from one or more special "library" pages there is this handy module: https://processwire.com/modules/media-library/2 points
-
To be honest... what you @LuisM and @bernhard have built... it's somehow awesome and at the same time somehow insane. I love it... even though I will almost never use it myself... as it's either too complex or way out of my comfort zone (for now). Yet... I really appreciate what you @bernhard do for the community with all your modules and ideas, and you @LuisM who builds things on top of them. Love it!2 points
-
Many thanks for your post and help! All has been revealed, along with some facepalm moments...
(1) had to change $config to wire('config')
(2) had to set the Tracy config to show in the dialog
(3) Yep, I'll just tell people up front that if they need to put something in front of it, they need to move the widget.1 point
-
Double-check that the path to your JS file is correct, because that conditional will only append the script markup if is_file() is true. With some debugging (see item 2) hopefully you can get it working. I just tested here again and it works for me. The label of that config setting is perhaps not perfect - I'll update it. The setting doesn't actually ensure that Tracy is displayed, rather it only disables the default CSS in the module that would otherwise hide the Tracy debug bar.

<?php if(!$hcd->show_tracy): ?>
#tracy-debug { display:none !important; }
<?php endif; ?>

So you first need to have Tracy set to display the debug bar in the PW admin generally, and then if you have that HCD setting checked it won't be hidden in the dialog. There's not much I can do about that because it's just how the CKEditor Widget plugin works. In such a situation I tend to use the arrow keys to move the cursor to the start of the field, or type the text underneath and then drag the widget to its new position.1 point
-
Yeah. That makes sense. Just thought maybe your approach and mine could somehow be merged? E.g. maybe migrations could not only live in /site/migrations but also in /site/modules/mymodule1/migrations and /site/modules/mymodule2/migrations? So each module could have a separate view of what you already have for the central migrations? Just brainstorming.1 point
-
Thanks a lot!!! It works... after a while of thinking and testing! Here is the code for my ready.php (located in the "site" directory). The module TextformatterProcessImages has to be installed. More details about the module are here: https://github.com/Toutouwai/TextformatterProcessImages/tree/7068b7864dd9e78c9cc4a76d6790957a41198dda

<?php
$wire->addHookAfter('TextformatterProcessImages::processImg', function(HookEvent $event) {
    // The Simple HTML DOM node for the <img> tag
    /** @var \simple_html_dom_node $img */
    $img = $event->arguments(0);
    // The Pageimage in the <img> src, if any (will be null for external images)
    /** @var Pageimage $pageimage */
    $pageimage = $event->arguments(1);
    // The Page object in case you need it
    /** @var Page $page */
    $page = $event->arguments(2);
    // The Field object in case you need it
    /** @var Field $field */
    $field = $event->arguments(3);
    // Only for images that have a src corresponding to a PW Pageimage
    if($pageimage) {
        // The original full size image
        $imgoriginal = $pageimage;
        // Small image with a width of 390 px
        $imgsmall = $pageimage->width(390);
        // Set the code for the lightbox link and image
        $img->outertext = "<a data-lightbox='image-1' title='zoom in' href='{$imgoriginal->url}'><img alt='{$pageimage->description}' src='{$imgsmall->url}' width='{$imgsmall->width}'></a>";
    }
});
?>

THANKS!!!1 point
-
Sometimes you can't see the forest for the trees, thanks! Thanks also for the module you linked to, that looks great. You seem to crop up the minute I ask a question here; I must owe you a fair few beers by now.1 point
-
You will get the URL to the image variation (small image) from the src attribute of the <img> tag. When you have that variation URL you can use it with PagefilesManager::getFile() to get the Pageimage (see here), and from the Pageimage you can get the original URL by $pageimage->url1 point
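Put together as a sketch that follows the description above ($page and $src are placeholders for the page holding the image and the variation URL taken from the <img> src):

$pageimage = $page->filesManager()->getFile($src);
if($pageimage instanceof Pageimage) {
    $originalUrl = $pageimage->url; // URL of the original image, per the note above
}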
-
Yep - since the settings definitions are defined in JSON or PHP files they can be copied from site to site and then set up in the admin (simply create a page under /admin/, set the process to SettingsFactory, then type in the path to the settings definition file). In terms of exporting the actual settings, i.e. the values, I have thought about that as well, as it would be nice to be able to back up the settings for any given settings configuration.1 point
-
Process Images
A basic, proof-of-concept Textformatter module for ProcessWire. When the Textformatter is applied to a rich text field it uses Simple HTML DOM to find <img> tags in the field value and passes each img node through a hookable TextformatterProcessImages::processImg() method. This is a very simple module that doesn't have any configurable settings and doesn't do anything to the field value unless you hook the TextformatterProcessImages::processImg() method.

Hook example
When added to /site/ready.php the hook below will replace any Pageimages in a rich text field with a 250px square variation and wrap the <img> tag in a link to the original full-size image. For help with Simple HTML DOM refer to its documentation.

$wire->addHookAfter('TextformatterProcessImages::processImg', function(HookEvent $event) {
    // The Simple HTML DOM node for the <img> tag
    /** @var \simple_html_dom_node $img */
    $img = $event->arguments(0);
    // The Pageimage in the <img> src, if any (will be null for external images)
    /** @var Pageimage $pageimage */
    $pageimage = $event->arguments(1);
    // The Page object in case you need it
    /** @var Page $page */
    $page = $event->arguments(2);
    // The Field object in case you need it
    /** @var Field $field */
    $field = $event->arguments(3);
    // Only for images that have a src corresponding to a PW Pageimage
    if($pageimage) {
        // Set the src to a 250x250 variation
        $img->src = $pageimage->size(250,250)->url;
        // Wrap the img in a lightbox link to the original
        $img->outertext = "<a class='lightboxclass' href='{$pageimage->url}'>{$img->outertext}</a>";
    }
});

GitHub: https://github.com/Toutouwai/TextformatterProcessImages
Modules directory: https://processwire.com/modules/textformatter-process-images/1 point
-
@mlfct, the issue mentioned above should now be fixed in SearchEngine version 0.30.2. Thanks for letting me know about this, and sorry for taking so long to solve it.1 point
-
I'm posting this as an update to an earlier post created by @Hari KT: https://processwire.com/talk/topic/4958-composer-support-for-processwire/. Though that approach still (kind of) works (as does the one detailed in https://github.com/wireframe-framework/processwire-composer-installer), thanks to @d'Hinnisdaël there's now a better alternative: the official composer/installers project.

An example repository implementing the things detailed in this post:
GitHub repository: https://github.com/teppokoivula/HelloWorld
Packagist entry: https://packagist.org/packages/teppokoivula/hello-world

As a module author, how do I make my module installable via Composer?

1) Add a composer.json file to your module's directory. Here's an example:

{
    "name": "vendor-name/module-name",
    "type": "processwire-module",
    "license": "MIT",
    "extra": {
        "installer-name": "ModuleName"
    },
    "require": {
        "composer/installers": "~1.0"
    }
}

The composer.json file explained:
"name" consists of two parts: your vendor (author) name, and the name of the package (module). These can (but don't have to) be the same as your GitHub or BitBucket user and repository names. Please note that this value should be all lowercase! That's the syntax expected by both Packagist and Composer.
"type" should be "processwire-module". You may have seen "pw-module" used by other packages; that's the value used by third-party installers, so you don't need to worry about it now.
"license" should specify the license your module is published under. See the Composer help for the expected syntax. It's technically fine to leave this out, but it's always a good idea to let users know how they're allowed to use your code.
"installer-name" under "extra" should specify the expected directory name for your module. Usually this is the same as your module's name. If you leave this out, the package part of the "name" value will be used instead (which may be just fine, though I'd recommend always filling in this value).
"require" includes the Composer dependencies of your module. The key part here is "composer/installers" - without this Composer won't know that your module requires said installer, and it may not be installable at all, so be sure to add this row.

2) Submit your project to Packagist: https://packagist.org/packages/submit. You will need an account for this step. It's free and very easy to register, and you can automatically connect it with your GitHub account. Connecting with GitHub also makes it easier to auto-update package versions from the GitHub repository.

3) Recommended but not absolutely necessary: add tags to your module's Git repository. It's recommended that when you push a new version of your module to GitHub or BitBucket, you also add a matching tag: if you push version 0.0.3 (or version "3", following the old-school ProcessWire version number format), you should also add tag 0.0.3 (or "v0.0.3" if you want to be verbose) to GitHub/BitBucket. (This step is not strictly speaking necessary, but it does make things easier for users installing your module, and makes it much easier to track which version of the module is currently installed via Composer. It requires an additional step when publishing a new version of the module, but please consider doing it anyway!)

4) Also recommended but not absolutely necessary: configure Packagist to auto-update based on GitHub/BitBucket. Follow the instructions here: https://packagist.org/about#how-to-update-packages. This step ensures that once you push a new version of your module, Packagist automatically updates the stored information without you logging in and hitting the "update" button manually. (This step may not be necessary if you've already allowed Packagist access to your GitHub account.)

... and that's it. Congratulations, your module is now installable via Composer!

As a module user, how do I install a module via Composer?

Go to your site's root directory and type "composer require vendor-name/module-name" on the command line. You can look up the correct details from Packagist, or the module author may have included them in the support forum thread. Obviously this only works for those modules that have implemented Composer installer support as outlined in this tutorial.

Note: if you're using a "non-standard" directory structure for ProcessWire - you've moved the root of the project outside the public web root, or something along those lines - check out the custom install paths part of the composer/installers README. The "installer-paths" setting allows you to manually specify a custom install path for the "processwire-module" package type.1 point
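For the "non-standard" directory structure case mentioned at the end, the installer-paths setting goes into the project's own composer.json; the path below is only an example and should point at wherever your site's modules directory actually lives.

{
    "extra": {
        "installer-paths": {
            "public/site/modules/{$name}/": ["type:processwire-module"]
        }
    }
}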
-
if($this->showForBackend) {
    $this->addHookBefore('ProcessPageList::execute', function ($event) {
        wire('config')->styles->add($this->config->urls->PageHitCounter . 'PageHitCounter.min.css');
        if($this->wire('input')->get('mode') != 'select') {
            $this->addHookAfter('ProcessPageListRender::getPageLabel', $this, 'addPageListHitCounter');
        }
    });
}1 point
-
Hi @gebeer
Just stumbled across this (I know my reply is a little late), but it is totally possible to prevent your REST API endpoints from starting sessions by using the $config->sessionAllow variable. If you define it to be a function that returns bool true or false, it will be evaluated and the return value determines whether the Session class constructor is allowed to start a new session. There's an example of it in the default wire/config.php file at line 245, reproduced here:

$config->sessionAllow = function($session) {
    // if there is a session cookie, a session is likely already in use so keep it going
    if($session->hasCookie()) return true;
    // if URL is an admin URL, allow session
    if(strpos($_SERVER['REQUEST_URI'], $session->config->urls->admin) === 0) return true;
    // otherwise disallow session
    return false;
};

You just need to rewrite the function so it returns false for your API endpoint path. Make that change and add it to your site/config.php file, and I think that anything hitting your API endpoint directly will not have a session created for the connection. Hope that helps!1 point
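As a concrete illustration, the stock function could be adapted along these lines; the /api/ path prefix is just an assumed example of where the REST endpoints might live.

$config->sessionAllow = function($session) {
    // never start a session for requests to the API endpoint (the /api/ path is an assumption)
    if(strpos($_SERVER['REQUEST_URI'], '/api/') === 0) return false;
    // if there is a session cookie, a session is likely already in use so keep it going
    if($session->hasCookie()) return true;
    // if URL is an admin URL, allow session
    if(strpos($_SERVER['REQUEST_URI'], $session->config->urls->admin) === 0) return true;
    // otherwise disallow session
    return false;
};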
-
Oh dear! Thanks @BitPoet, that's the solution! I actually just missed updating the version in info.json…1 point
-
RestApi.info.json in the github repo still says 0.0.3. Did you forget to sync your local changes to github?1 point
-
The problem with your approach is that it only requests the data once, and ProcessWire returns all pages instead of those that match the query string. This could be a problem on very dynamic sites where the content changes often. Here is my solution, which solves this: modify the standard search.php and add

if ($config->ajax) {
    header("Content-type: application/json"); // Set header to JSON
    echo $matches->toJSON(); // Output the results as JSON via the toJSON function
}

so the whole file reads

<?php namespace ProcessWire;

// look for a GET variable named 'q' and sanitize it
$q = $sanitizer->text($input->get->q);

// did $q have anything in it?
if ($q) {
    // Send our sanitized query 'q' variable to the whitelist where it will be
    // picked up and echoed in the search box by _main.php file. Now we could just use
    // another variable initialized in _init.php for this, but it's a best practice
    // to use this whitelist since it can be read by other modules. That becomes
    // valuable when it comes to things like pagination.
    $input->whitelist('q', $q);

    // Sanitize for placement within a selector string. This is important for any
    // values that you plan to bundle in a selector string like we are doing here.
    $q = $sanitizer->selectorValue($q);

    // Search the title and body fields for our query text.
    // Limit the results to 50 pages.
    $selector = "title|body%=$q, limit=50";

    // If user has access to admin pages, lets exclude them from the search results.
    // Note that 2 is the ID of the admin page, so this excludes all results that have
    // that page as one of the parents/ancestors. This isn't necessary if the user
    // doesn't have access to view admin pages. So it's not technically necessary to
    // have this here, but we thought it might be a good way to introduce has_parent.
    if ($user->isLoggedin()) $selector .= ", has_parent!=2";

    // Find pages that match the selector
    $matches = $pages->find($selector);
    $cnt = $matches->count;

    // did we find any matches?
    if ($cnt) {
        // yes we did: output a headline indicating how many were found.
        // note how we handle singular vs. plural for multi-language, with the _n() function
        $content = "<h2>" . sprintf(_n('Found %d page', 'Found %d pages', $cnt), $cnt) . "</h2>";

        // we'll use our renderNav function (in _func.php) to render the navigation
        $content .= renderNav($matches);
    } else {
        // we didn't find any
        $content = "<h2>" . __('Sorry, no results were found.') . "</h2>";
    }

    if ($config->ajax) {
        header("Content-type: application/json"); // Set header to JSON
        echo $matches->toJSON(); // Output the results as JSON via the toJSON function
    }
} else {
    // no search terms provided
    $content = "<h2>" . __('Please enter a search term in the search box (upper right corner)') . "</h2>";
}

Then call typeahead with the following options:

$.typeahead({
    input: '#q',
    order: 'desc',
    hint: false,
    minLength: 3,
    //cache: false,
    accent: true,
    display: ['title'], // Search objects by the title-key
    backdropOnFocus: true,
    dynamic: true,
    backdrop: {
        "opacity": 1,
        "background-color": "#fff"
    },
    href: "{{url}}",
    emptyTemplate: "No results for {{query}}",
    searchOnFocus: true,
    cancelButton: false,
    debug: true,
    source: {
        //url: actionURL // Ajax request to get JSON from the action url
        ajax: {
            method: "GET",
            url: actionURL,
            data: {
                q: '{{query}}'
            }
        }
    },
    callback: {
        onHideLayout: function (node, query) {
            $('#searchform').hide();
            console.log('hide search');
        }
    }
});

The important parts are "dynamic: true" and the "source" configuration, so the query string is being sent. Now you have a nice AJAX search.
EDIT: If you also want to match the query string in fields other than the title, make sure you add filter: false to the typeahead config.1 point