Everything posted by gebeer
-
In the past I had used the old version of RockMigrations for some simple tasks like adding fields and templates, which turned out to be a huge time saver. But only recently did I start to use the new version and discover the possibilities of this tool in more depth. And what should I say, I am really amazed. Once I had grasped the core concepts and, with @bernhard's help, had learned how to use it in a more structured way and in different contexts, it turned out to be an even bigger time saver.

I hate repetitive tasks. Adding fields and templates, adding fields to templates, and configuring labels / access rights etc. for fields in the template context is a pleasure when using the PW GUI. But it is still a repetitive task. With RockMigrations I can keep the definitions of these fields / templates and fields in template context in a single file and reuse that. In a matter of seconds I have my reusable base structure available to start off a new project.

Adding new structure through the GUI in a staging environment and then having to replicate it on the live system through the GUI? A repetitive task again. Pushing a migration file to the live system and running the migration? Done in a matter of seconds.

Writing migrations wherever I want is also a great feature. Be it inside site/migrate.php or inside a newly developed module, it doesn't matter. RockMigrations will find it and do its job.

At the beginning I wasn't sure how to define different field types with all their different properties. RockMigrations comes with VSCode snippets that make things easy. Or you can create a field in the GUI and then just copy/paste the code for the migration from the GUI into your migration logic. So however you prefer to create your fields, RockMigrations has you covered.

This post may sound like an advertisement. But I just wanted to express how happy I am after having made the decision to spend some time learning how to work with this module. That time was definitely well spent.
Big thanks to Bernhard for creating this and releasing it as a free module.
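To give an idea of what such a reusable base structure can look like: a minimal sketch of a migration file, assuming the RockMigrations migrate() array syntax; the field and template names here are illustrative, not taken from an actual project.

```php
<?php namespace ProcessWire;
// site/migrate.php -- sketch of a reusable base structure
// (field/template names are illustrative examples)
/** @var RockMigrations $rm */
$rm = $this->wire->modules->get('RockMigrations');
$rm->migrate([
    'fields' => [
        'intro' => [
            'type'  => 'textarea',
            'label' => 'Intro text',
        ],
    ],
    'templates' => [
        'basic-page' => [
            'fields' => ['title', 'intro'],
        ],
    ],
]);
```

Dropping a file like this into a new project recreates the whole structure in one run instead of clicking it together in the GUI.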
-
Hello, I am reusing a generic image field inside repeaters and need to modify the required minWidth based on the context of the image inputfield. So I have a repeater with the fields "images" and "images_layout". Based on the image layout I am modifying the image requirements. This is working fine for Inputfield->description or Inputfield->notes, but not for Inputfield->minWidth: the UI does not give warnings if the image size is below the minWidth. Here's my hook code:

// add image info to Bilder field inside matrix item depending on layout of images
// restrict images with wrong dimensions
wire()->addHookBefore('InputfieldImage::render', function (HookEvent $event) {
    /** @var InputfieldImage $imageInput */
    $imageInput = $event->object;
    // if ($this->user->isSuperuser()) return;
    $hasPage = $imageInput->hasPage;
    // this example uses repeater matrix but behaviour is the same with repeaters
    if (!$hasPage instanceof RepeaterMatrixPage) return;
    if ($hasPage->template->name != 'repeater_content_blocks') return;
    $ifWrapper = $imageInput->getRootParent();
    if ($ifWrapper) {
        $layoutInputfields = $ifWrapper->find('label="Layout Bilder"');
        if ($layoutInputfields->count) {
            $layoutInputfield = $layoutInputfields->eq(0);
            $layoutField = $layoutInputfield->hasField();
            $layout = $hasPage->get($layoutField->name)->value;
            switch ($layout) {
                case 'stacked':
                case 'stacked169':
                    $dims = array('width' => 2560, 'height' => 1440);
                    break;
                case 'parallax':
                    $dims = array('width' => 1024, 'height' => 768);
                    break;
                default:
                    $dims = array('width' => 2560, 'height' => 1440);
                    break;
            }
            $imageInput->notes = "Bild-Dimensionen (Breite x Höhe) in px: {$dims['width']} x {$dims['height']}";
            $imageInput->minWidth = $dims['width']; // this setting is not respected in GUI
            $imageInput->maxWidth = $dims['width']; // this setting is not respected in GUI
            $imageInput->setMaxFilesize('1m'); // 1 MB // this setting is not respected in GUI
            $imageInput->maxReject = true;
            $event->object = $imageInput;
        }
    }
});

When I set the minWidth via the images field settings, I get a warning when an image is below minWidth. But this way I can only set one value for all image input fields in the different contexts. When I set the minWidth through my hook on the individual image input fields, I do not get this warning. I would have to use separate image fields for each different context, which I want to avoid for maintainability and DB performance reasons. The code in InputfieldImage.module returns $minWidth as an empty string when no value is set through the GUI, and this seems to be the culprit. Maybe I need to hook into InputfieldImage::fileAdded to achieve what I want? But in there I also do not have the minWidth available that I set through my hook. So currently I am lost and need some pointers on how to solve this.
-
Hi, you need to construct the HTML for the link when building the array that you pass to $table->row, and set $table->encodeEntities = false; In this example I am building the rows from a pro field table:

/** @var TableRows $transactions */
$transactions = $this->wire('pages')->get('template=membership')->membership_transactions;
/** @var MarkupAdminDataTable $table */
$table = $this->wire('modules')->get('MarkupAdminDataTable');
$table->addClass('AdminDataList table-striped');
$table->sortable = true;
$table->encodeEntities = false; // important
$table->headerRow($transactions->getLabels());
/** @var TableRow $t */
foreach ($transactions as $t) {
    $row = array();
    foreach ($t->getArray() as $key => $val) {
        if ($key == 'id') continue;
        if ($key == 'member') $val = "<a class='pw-panel' href='{$val->editUrl}'>$val->fullname</a>"; // construct your link here
        if ($key == 'amount') $val = $val . ' USD';
        if ($key == 'date') $val = date('Y-m-d h:i', $val);
        if ($key == 'status') $val = ($val) ? 'success' : 'failure';
        $row[] = $val;
    }
    // bd($row);
    $table->row($row);
}
return $table->render();

Hope that helps.
-
Thank you for all your valued feedback so far. This is an opinionated setup, like almost all webpack setups are in a way. I'm using this as a working starter and therefore tried to keep it as lean as possible, concentrating only on Tailwind, PostCSS and Babel. If I needed to add other libs like Alpine etc., it could easily be extended.
-
That sounds intriguing. I'm happy with my setup and the build time is pretty fast already. Seems like every couple of years a new tool is being hyped as faster, easier, better etc. I'll definitely have a look at vite but for the moment I'm going with webpack. A repo where we can look at some vite setup code with PW would be welcome. Don't think it is necessary to have a tutorial, though. I am sure there are tons out there already. But thank you.
-
Oh, I see. The feature request is about improving documentation and giving developers a way to disable stripping of comments.
-
Thanks again for the suggestion. I really don't want to spend more time finding a workaround. I'll wait until we get some feedback on the feature request and a clean way to avoid stripping comments. In the meantime, I fell back to good old if statements for including the necessary HTML blocks.
-
I have set up a starter kit for developing PW sites with Tailwind CSS. You can find it at https://github.com/gebeer/tailwind-css-webpack-processwire-starter

I know, I'm a bit late to the party with Tailwind. In fact I only discovered it a couple of months ago and found it to be a very interesting concept. It took quite some time to get a pure webpack/PostCSS/Tailwind setup that features a live-reloading dev server and cache busting, supports Babel as well as autoprefixing, and works well with PW.

For the cache busting part I developed a small helper module that utilizes webpack's manifest.json to always load the assets with the correct file hashes. No more timestamps, version numbers or the like required. The little helper can be found at https://github.com/gebeer/WebpackAssets

Having used it for the first time in production on a client project, I have to say that I really enjoy working with Tailwind. The concept of using utility classes only really convinced me after having put it into practice. During the work on the client project, some quirks with webpack-dev-server came to the surface. I was able to solve them with the help of a colleague, and now it is running quite stably in both Linux and Mac environments.

I am fascinated by the fast build times compared to using webpack/gulp or gulp/sass. Also the small file size of the compiled CSS is remarkable compared to other frameworks. So this will be my go-to setup whenever I am free to choose the frontend framework for a project. If anyone feels like giving it a try, I shall be happy to get your feedback.
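For anyone wondering how manifest-based cache busting works in principle: webpack writes a manifest.json that maps source filenames to their hashed output names, and the template resolves the current hash at runtime. A minimal sketch (this is the general idea, not the actual WebpackAssets API; the dist path is an assumption):

```php
<?php namespace ProcessWire;
// Sketch of manifest-based cache busting.
// webpack emits e.g. dist/main.3f2a1b.css and records it in manifest.json.
$manifest = json_decode(file_get_contents(
    $config->paths->templates . 'dist/manifest.json'
), true);
// $manifest['main.css'] now holds the current hashed filename,
// so the stylesheet URL changes whenever the content changes.
echo "<link rel='stylesheet' href='{$config->urls->templates}dist/{$manifest['main.css']}'>";
```

Because the hash only changes when the file content changes, browsers can cache aggressively without ever serving a stale build.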
-
Unfortunately this is not working. And it looks ugly and feels so wrong ;-) But thank you for the suggestion.
-
This only helps in some cases. In my case I need the comment to read <!-- ko if: properties.ms == "pro" --> for knockout.js to recognise it. So the other syntax would break my JS. But thanks for pointing that out.
-
I made a feature request for documentation improvement: https://github.com/processwire/processwire-requests/issues/454
-
Hello, MarkupRegions strips out HTML comments by default. I just discovered that while converting a rather old project to use MarkupRegions. In that project, HTML comments are required for some knockout.js code to work, and I was scratching my head for quite a while until I realized that those comments got removed. This behaviour is not mentioned anywhere in the documentation, other than in the code itself at https://github.com/processwire/processwire/blob/3acd7709c1cfc1817579db00c2f608235bdfb1e7/wire/core/WireMarkupRegions.php#L89 and in an old forum post. How can I tell MarkupRegions that my markup needs to stay exactly the same, or in other words, where can I pass the option ("exact" => true) to the MarkupRegions find() method to change the default behaviour?
-
How to insert scripts and stylesheets from your module?
gebeer replied to hdesigns's topic in Module/Plugin Development
You might want to have a look at https://processwire.com/modules/less/ and https://processwire.com/modules/sassify/ for example implementations of parsers. I don't quite see an advantage in compiling frontend assets on the server, except for themes that want to provide customisations without the need for coding. One module for each frontend package seems like a reasonable way to go, but keeping those up to date will also be quite time consuming for the maintainer. This is where package managers come in handy. So, if you can find composer packages for libraries like BS and UIkit that you can utilize for the PW modules, then that would be a big time saver.
-
Unfortunately we were not on InnoDB. But the problem was not so much memory consumption as the long execution time for saving pages. Total memory consumption after saving values to 2500 pages was only around 45MB.
-
Yes, it would for sure. I recall that Ryan was talking about making the PW API available through JS some years back. That would be exactly what we need. A RESTful API like the WP REST API in ProcessWire. This should ideally be available as an optional core module and support all Pro fields.
-
Hi, in a recent project I had to import large amounts of data that got uploaded through the frontend. To avoid lags in the frontend I pushed the actual import work to processes in the background. Since time for that project was limited, I resorted to a quite simple method of starting the workers:

public function startSalesImportWorker() {
    $path = $this->config->paths->siteModules . "{$this->className}/workers/";
    $command = "php {$path}salesimportworker.php";
    $outputFile = "{$path}/logs/workerlogs.txt";
    $this->workerPid = (int) shell_exec(sprintf("%s > $outputFile 2>&1 & echo $!", "$command"));
    if ($this->workerPid) return $this->workerPid;
    return false;
}

Here's the worker code:

namespace ProcessWire;

use SlashTrace\SlashTrace;
use SlashTrace\EventHandler\DebugHandler;

include(__DIR__ . "/../vendor/autoload.php");
include(__DIR__ . "/../../../../index.php");

ini_set('display_errors', false);
error_reporting(E_WARNING | E_ERROR);

$slashtrace = new SlashTrace();
$slashtrace->addHandler(new DebugHandler());
// $slashtrace->register();

$lockfile = __DIR__ . "/locks/lock-" . getmypid();

// restart when fail or done
function workerShutdown($args) {
    // release lockfile
    if (file_exists($args['lockfile'])) unlink($args['lockfile']);
    echo PHP_EOL . "Restarting...\n";
    $outputFile = __DIR__ . '/logs/workerlogs.txt';
    $command = PHP_BINARY . " " . $args['command'];
    sleep(1);
    // execute worker again
    exec(sprintf("%s > $outputFile 2>&1 & echo $!", "$command"));
}
register_shutdown_function('ProcessWire\workerShutdown', array('lockfile' => $lockfile, 'command' => $argv[0]));

// wait for other workers to finish
while (wire('files')->find(__DIR__ . "/locks/")) {
    sleep(5);
}

// create lockfile
file_put_contents($lockfile, $lockfile);

try {
    // ini_set('max_execution_time', 300); // 300 seconds = 5 minutes
    wire('users')->setCurrentUser(wire('users')->get("admin"));
    echo "starting import: " . date('Y-m-d H:i:s') . PHP_EOL;
    /** @var \ProcessWire\DataImport $mod */
    $mod = wire('modules')->get("DataImport");
    $mod->importSales();
    echo PHP_EOL . "Import finished: " . date('Y-m-d H:i:s');
    // run only for 1 round, then start a new process: prevents memory issues
    die;
} catch (\Exception $e) {
    $slashtrace->handleException($e);
    die;
}

I got the idea for restarting the same worker from https://www.algotech.solutions/blog/php/easy-way-to-keep-background-php-jobs-alive/

Note that I am using https://github.com/slashtrace/slashtrace for error handling since it gives nice CLI output. And I couldn't figure out how to utilize the native PW Debug class for that since I needed stack traces.

Overall this solution worked quite well. But it doesn't give any control over the worker processes; at least there was no time to implement that. Only after having finished the project did I discover https://symfony.com/doc/current/components/process.html which seems to have everything you need to start and monitor background processes. So next time the need arises I will definitely give it a try. I'm imagining a Process module that lets you monitor/stop background workers and a generic module to kick them off. How do you handle background processes with PW?
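For comparison, starting a worker with the Symfony Process component mentioned above might look roughly like this. A sketch only: it assumes `composer require symfony/process` and the worker path from the example; I haven't used it in production with PW yet.

```php
<?php
// Sketch: starting and monitoring a background worker with symfony/process
// instead of raw shell_exec() (worker path is illustrative).
use Symfony\Component\Process\Process;

require __DIR__ . '/vendor/autoload.php';

$process = new Process(['php', __DIR__ . '/workers/salesimportworker.php']);
$process->start(); // non-blocking, like the shell_exec() approach above

$pid = $process->getPid(); // keep this to monitor the worker
// later you can check $process->isRunning(), read $process->getOutput(),
// or terminate it with $process->stop(10);
```

The main win over shell_exec() would be exactly the process control that was missing: PID handling, output capture and a clean way to stop a worker.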
-
-
I guess the reason why @wbmnfktr resorted to WordPress is that you have the JSON feeds readily available without any coding. And I often thought that it would be really awesome if we had something like this in PW. With all the modules mentioned (AppAPI, GraphQL etc.) you still need to write code to get the desired output at the desired endpoint. WordPress handles this out of the box, and this makes it attractive. PW would definitely benefit and get more attention if there was a module that automatically creates RESTful API endpoints following the page tree structure.
-
Hi, just wanted to share my experience working with larger data sets in PW. For a recent project I had to import rather large data sets (between 200 and 5000 rows) from Excel sheets. The import files got processed in chunks, and I had to save only one value to one float field on a page. There were about 15,000 pages of that type in the system and another 1800 pages of a different type. The process of saving each page got really slow when looping through hundreds or thousands of pages. Inside the loop I retrieved every page with a $pages->get() call. This was really fast, but saving got very slow. It took about 4 minutes to process 2500 pages on my dev docker machine and about 2 minutes on the live server with 16 virtual cores and 80GB of RAM. There was one hook running on change of the field that I saved the values to, which did a simple calculation and saved the result to a different float field on the same page. And I guess that was one reason for slowing the process down. After changing the logic and doing the calculation in the import loop, things got a little better, but not much. So I wonder if PW is just not designed to handle large numbers of page saves in an efficient way? I would have thought otherwise.
What I ended up doing is writing the values directly to the field table in the DB with a simple method (using bound parameters for the value and page id; the table name itself cannot be bound, so it stays interpolated):

public function dbSetData($field, $id, $value) {
    $db = $this->wire->database;
    $statement = "UPDATE `{$field}` SET `data` = :value WHERE `pages_id` = :id";
    $query = $db->prepare($statement);
    $query->bindValue(':value', $value);
    $query->bindValue(':id', $id, \PDO::PARAM_INT);
    if ($db->execute($query)) return true;
    return false;
}

and also getting the required value for the simple calculation directly from the DB:

public function dbGetData($field, $id) {
    $db = $this->wire->database;
    $statement = "SELECT `data` FROM `{$field}` WHERE `pages_id` = :id";
    /** @var WireDatabasePDOStatement $query */
    $query = $db->prepare($statement);
    $query->bindValue(':id', $id, \PDO::PARAM_INT);
    if ($db->execute($query)) {
        $resultStr = $query->fetchColumn();
        // cast the numeric string to a number
        if (is_string($resultStr)) return $resultStr + 0;
    }
    return false;
}

And that drastically dropped execution time to about 8 seconds for 2500 rows, compared to 2 minutes with $pages->get() and $pages->save(). I guess if I need to set up a scenario like this again, I will not use pages for storing those values but rather use a Pro Fields Table field or write directly to the DB.
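One more thing that can speed up many small writes like this: wrapping the whole batch in a single transaction, so the engine commits once instead of once per row. A sketch under the assumption that the field tables are on a transaction-capable engine like InnoDB; the `field_price` table name and `$rows` array are illustrative.

```php
<?php namespace ProcessWire;
// Sketch: batching many direct field-table writes in one transaction.
// $rows is assumed to map pages_id => float value.
$db = $this->wire->database;
$db->beginTransaction();
$query = $db->prepare("UPDATE `field_price` SET `data` = :value WHERE `pages_id` = :id");
foreach ($rows as $id => $value) {
    $query->bindValue(':value', $value);
    $query->bindValue(':id', $id, \PDO::PARAM_INT);
    $query->execute();
}
$db->commit(); // one commit for the whole batch
```

Since the tables in my case were on MyISAM this wasn't an option there, but on InnoDB it should shave off a good part of the per-row overhead.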
-
module ImageOptim — automatic image optimization
gebeer replied to d'Hinnisdaël's topic in Modules/Plugins
Wow, that was fast. Thank you!
-
module ImageOptim — automatic image optimization
gebeer replied to d'Hinnisdaël's topic in Modules/Plugins
Hi @d'Hinnisdaël, do you still support this module? Is it working with the latest PW master?
-
With coded migrations you alter the structure of the application, and it is great that your module provides this. But I think we should respect that not everyone wants to do this by code. Some would rather use the admin UI. And this is where the recorder comes in. Any changes are reflected in a declarative way. Even coded migrations would be reflected there. What I was trying to say is that the complete state of the application should be tracked in a declarative manner. How you get to that state, be it through coded migrations or through adding stuff through the UI, should be secondary and left up to the developer. Please don't go just yet. I'm sure we can all benefit from your input.
-
Have a look at Bernhard's video in the first post of this thread, which presents a proof-of-concept YAML recorder. This records all fields/templates to a declarative YAML file once you make changes to either a field or template through the admin UI. So you will always get the current state of all fields/templates as a version-controllable file. That could be imported back once someone writes the logic for the import process. That YAML recorder really is a great first step. But fields/templates config alone does not represent the full state of a site's structure. We'd also need to record the state of permissions, roles and modules and later create/restore them on import. @bernhard's RockMigrations module already has the methods createPermission and createRole, as does the PW API. Modules can be installed/removed through the PW modules API. So importing the recorded changes should be possible. The recorder and importer are the key features needed for version controlling application structure. Adding fields/templates/permissions/roles/modules through code like with RockMigrations would be an added benefit for developers who don't like using the admin UI.
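To make the idea concrete, a full recorded state file could have roughly this shape. This is purely hypothetical; the actual format of the proof-of-concept recorder may differ, and all names below are made up for illustration.

```yaml
# Hypothetical shape of a recorded site-state file.
fields:
  intro:
    type: textarea
    label: Intro text
templates:
  basic-page:
    fields: [title, intro]
permissions:
  page-publish-news:
    title: Publish news pages
roles:
  editor:
    permissions: [page-edit, page-publish-news]
modules:
  - TextformatterMarkdownExtra
```

The point is that one declarative file like this could capture everything an importer would need to recreate the structure, however the state was originally produced.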
-
Resolved - MarkupAdminDataTable syntax issue
gebeer replied to opalepatrick's topic in General Support
When you build a table row, the attributes for that row need to go in an options array.

$data = [$club->title, $club->id];
$options = array();
// class (string): specify one or more class names to apply to the <tr>
$options['class'] = "class1 class2";
// attrs (array): array of attr => value for attributes to add to the <tr>
$options['attr'] = array('cid' => $club->id);
$table->row($data, $options);

Hope this helps.
-
You need to give them this mantra: "pull - merge - push" and let them recite it at least 100 times a day. It only knows that it needs to deploy on push, so it won't deploy when someone merges stuff on the server. From that perspective it is pretty fail-safe. Give it a try, I think you'll love it.