Leaderboard
Popular Content
Showing content with the highest reputation on 02/23/2026 in all areas
-
Hi everyone, I'd like to introduce Banana Imagine — a ProcessWire module that brings high-quality AI image generation directly into your image fields using the Google Nano Banana API.

Key features:
- Clean generation interface right below supported image fields
- Generate 1–4 variations at once
- Smart subtle prompt variations for better batch diversity
- Selected images saved natively to the page (with clean naming: [pageID]-[timestamp].jpg)
- Simple configuration: API key + choose which image fields to enable

GitHub: https://github.com/mxmsmnv/BananaImagine

This module is a fork / spiritual successor to my previous module GrokImagine (xAI/Grok-based): https://processwire.com/talk/topic/31744-grokimagine-ai-image-generation-via-xai/

Installation & usage instructions are in the README: just drop the folder into /site/modules/, install, add your Google AI API key (billing required for image generation), select fields, and you're good to go.

Screenshots:

Feedback, bug reports, and feature ideas are very welcome!

Thanks, Maxim
2 points
-
RockPageBuilder is free: https://processwire.com/modules/rock-page-builder/ But yeah, it's for sure not as performant as a single body field... You can cache the output via Template- or MarkupCache (free) or using ProCache (not free, but great)2 points
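For the MarkupCache route, a minimal sketch might look like this (standard ProcessWire API; the cache name, expiry, and field used here are arbitrary examples, not anything RockPageBuilder-specific):

```php
$cache = $modules->get("MarkupCache");
// get() returns the cached markup, or false if missing/expired
if(!$out = $cache->get("rpb_output", 3600)) {
    $out = $page->body; // e.g. the rendered block output
    $cache->save($out);
}
echo $out;
```

This caches the rendered markup for an hour, so the block-rendering cost is only paid on a cache miss.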
-
Hi everyone! I've built AiWire — a module that connects ProcessWire to AI providers (Anthropic, OpenAI, Google, xAI, OpenRouter).

GitHub: https://github.com/mxmsmnv/AiWire

What it does

$ai = $modules->get('AiWire');

// Simple call
echo $ai->chat('What is ProcessWire?');

// Generate multiple fields at once
$ai->generate($page, [
    ['field' => 'ai_overview', 'prompt' => "Write overview..."],
    ['field' => 'ai_seo_meta', 'prompt' => "Generate meta..."],
], ['cache' => 'W']);

// Auto-fallback if provider fails
$result = $ai->askWithFallback('Translate this...', [
    'provider' => 'anthropic',
    'fallbackProviders' => ['openai', 'google'],
]);

Main features
- Multiple API keys per provider with auto-failover
- Connection testing from admin
- Interactive Test Chat with parameter controls
- File cache with TTL (day/week/month/year)
- Save AI responses to page fields
- Multi-turn conversations
- Full docs with 25 real-world examples

Requirements
PHP 8.1+, ProcessWire 3.0.210+, cURL, and at least one API key.

If you try it out, I'd love to hear your feedback — whether the API makes sense, if the docs are clear, or if you run into any issues. Thanks! 🙏
2 points
-
I think you don't have to worry. What you did is effectively what ProFields Repeater Matrix does: it's a repeater which collectively contains all the different fields and selectively displays only whatever should show for the currently selected type. We're using ProFields in practically every project, and let me tell you, it gets complex and uses a lot of fields fast. In many cases, by design, most of these fields stay empty, just like in your case. And this is never a performance problem at all. ProcessWire is extremely scalable and will smartly load fields only when they are needed.
2 points
-
Media Hub update

The Media Hub now has a folder view.
- Folders is the default name for the view. It can be renamed to Gallery, Collections, Assets, Media, etc.
- Media assets can exist in one or more Folders, so essentially they operate like Tags.
- Clicking a Folder name will filter the main Media Hub by items within that Folder (and optionally subfolders).
- On an Asset detail view, you can quickly add and remove an association to a Folder.
- Support for Folder nesting, renaming, etc. is working.
2 points
-
There aren't any backreferences in the regex, so it's probably just a question of table+field size, but you can try upping regexp_time_limit. I wouldn't go into the three digits, though, if you have significant concurrent requests. Regular expressions are expensive, and tweaking limits only gets you so far before overall DB performance degrades too much.

I was scratching my head a bit over why a ~= would invoke an RLIKE, but it's probably because AMA is only three letters, and the minimum word length for MyISAM fulltext indexes defaults to 4 (parameter ft_min_word_len). You may want to lower that to two (which means rebuilding the fulltext indexes); then PW will use native fulltext search instead of the regex workaround for short words.
2 points
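For reference, lowering the minimum word length might look like this (assuming MySQL with MyISAM fulltext indexes; field_title is a hypothetical ProcessWire field table, and for InnoDB tables the variable is innodb_ft_min_token_size instead):

```ini
# my.cnf — lower the minimum indexed word length (requires a server restart)
[mysqld]
ft_min_word_len = 2
```

```sql
-- after the restart, rebuild the fulltext index on each affected table, e.g.:
REPAIR TABLE field_title QUICK;
```

REPAIR TABLE ... QUICK is the MySQL-documented way to rebuild MyISAM indexes after changing ft_min_word_len.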
-
Hey, I've started to create a Media Hub for ProcessWire. [Edit: the newest updates to the UI are in later posts] Screenshots attached. Obviously a few UI improvements are needed 🙂 One of my clients requested a centralised media manager. I thought it'd be fun to give it a go and learn some stuff. I know that with a self-built module, it'll always be maintained, and I have an active interest in evolving it. Shout out to @markus-th who just announced he is doing something similar with WireMedia.
1 point
-
Fantastic. You've already won my heart. 🤩

P.S.: @Peter Knight, this raises another question for me: have you also planned a 'delete protection' feature? When an image is deleted from the Media Hub, is it first checked to see whether the image is used on any pages? If so, could deletion be blocked, with an option to list all the pages where the image is used?
1 point
-
That's the AI bragging a bit; field/template creation is still missing, to give an example. I focused on page CRUD first. This looks amazing 😮
1 point
-
Hi fellow devs, this is a somewhat different post, a little essay. Take it with a grain of salt and some humor. Maybe some of you share a similar experience. I don't really mean to poop on a certain group with certain preferences, but then, that's what I'm doing here. I needed to write it to unload some frustration. No offense intended. Good Sunday read :-)

React Is NPC Technology

Have you ever really looked at React code? Not the tutorial. Not the "Hello World." An actual production component from an actual codebase someone is actually proud of?

Because the first time I did, I thought there'd been a mistake. A failed merge. HTML bleeding into JavaScript, strings that weren't strings, logic and markup performing some kind of violation you'd normally catch in code review before it got anywhere near main. "Fix this," I thought. "Someone broke this."

It looks broken because it is broken. That's the first thing you need to understand. JSX is a category error. Mixing markup and logic at the syntax level - not as an abstraction, not behind an interface, but visually, literally, right there in the file - is the kind of decision that should have ended careers. Instead it ended up on 40% of job postings.

And here's the part that actually matters, the part that explains everything: Nobody can tell you why. "Everyone uses it." Go ahead, ask. That's the answer. That's the complete sentence, delivered with the confidence of someone who has never once questioned whether a thing should exist before learning how it works.

The argument for React is React's market share. The case for Next.js is that your tech lead saw it in a conference talk in 2021 and it was already too late. You're supposed to hear this and nod - because if everyone's doing something, there must be a reason, right? The herd doesn't just run toward cliffs. Except. That's literally what herds do.

The web development community, bless its heart, has a category of decision I can only call NPC behavior.
Not an insult - a technical description. An NPC doesn't evaluate options. An NPC reads the room, finds the dominant pattern, and propagates it. React is on every job posting = React is what employers want = React is what I need to know = React is what I reach for. The loop closes. Nobody along the chain asked if it was right. They asked if it was safe. Safe to put on a resume. Safe to recommend. Safe to defend at the standup.

React is the framework you choose when you've stopped choosing and started inheriting.

The 10% who actually think about their tools - they're out there running Alpine.js. Which is 8kb. Does the same job. No build step required. Add an attribute, the thing works. Revolutionary concept. They're running htmx, which understood something profound: the web already has a protocol for moving data, and it was fine. You didn't need to rebuild HTTP in JavaScript. You just needed to reach for the right thing instead of the fashionable one.

Let's talk performance, because "everyone uses it" is already bad enough before you look at what it actually does. React ships 40-100kb of runtime JavaScript before your application does a single thing. Your users wait while React bootstraps itself. Then it hydrates - a word that sounds refreshing and means "React redoes on the client what the server already did, because React can't help it." Then they invented Server Components to fix the problem of shipping too much JavaScript. The solution: ship different JavaScript, handled differently, with new mental models, new abstractions, new ways to get it wrong. They called it an innovation.

I once worked with WordPress and React together. I want you to sit with that. Two philosophies, neither of which is actually correct, stacked on each other like a complexity casserole nobody ordered. WordPress solving 2003's problems with 2003's patterns. React solving 2003's problems with 2013's patterns that created 2023's problems.
Together they achieved something genuinely special: all the drawbacks of both, and none of the advantages of either. The PHP you want but in a different way, and the hydration you couldn't prevent, serving pages that load like they're apologizing for something.

Twenty years building for the web and I've watched frameworks rise and fall like geological events. ColdFusion, anyone? Remember when Java applets were going to be everywhere? Flash was going to be the web. Then jQuery saved us. Then Angular saved us from jQuery. Then React saved us from Angular. Rescue upon rescue, each one leaving more complexity than it cleared, each one defended by exactly the same people who defended the last one, now wearing a different conference lanyard.

ProcessWire. That's what I build with. Most developers have never heard of it - which is not a criticism, that's the evidence. You find ProcessWire because you went looking for something specific, evaluated it, and it fit. It doesn't have conference talks. It doesn't have a VC-funded developer relations team. It has a forum full of people who chose it. That's a different category of thing entirely.

The same 10% who find ProcessWire find Alpine. Find htmx. Make decisions that don't optimize for defensibility in interviews. Build websites that load fast because they don't carry React around everywhere they go.

There's a physics concept called a local minimum. A place where a system settles because the immediate neighborhood looks stable - the energy gradient points upward in every direction, so the system stops. Stays. Convinces itself it's home. Even if a global minimum exists somewhere else, at lower energy, lighter, simpler - you'd have to climb first, and the herd doesn't climb.

React is a local minimum. The web settled here when it got tired of looking. Stable enough. Defended by enough career investment. Surrounded by enough tooling and tutorials and framework-specific bootcamps that switching costs feel existential.
The ground state - simpler, faster, closer to what the web actually is - sits somewhere else, past a hill that looks too steep from inside the valley. The ground state is always simpler. That's not a philosophical position. That's thermodynamics.

They don't want you to know that.
1 point
-
I use https://github.com/ChromeDevTools/chrome-devtools-mcp for that. Very fast.

The thing about the plan is that it's supposed to be reviewed before it is applied, haha. But if you trust it without at least a quick glance, ok. I think Cursor can play sounds when it needs your attention. You could use that to notify you when you have to click the button.

Wow. The "Full ProcessWire API access - Query, create, update, and delete pages" is the most interesting part for me here.

I'm working right now on a single-file PW-API-docs database based on https://github.com/memvid/memvid. It has semantic vector search (local embedding model), BM25 and all that good stuff. It also supports CRUD. I fed it a good part of https://github.com/phlppschrr/processwire-api-docs/blob/main/api-docs/index.md . The file is currently around 35MB. Search is blazingly fast. I'm implementing it as a portable skill, not as an MCP. It needs a little more love and testing, but I'll drop it soonish.
1 point
-
Incidentally, when you mentioned you reached the limits of performance, where and how did those limits manifest?

I ran your query/issue through my AI agent and got the following response:

The Real Diagnosis
Your friend likely experienced real slowness, but the cause is almost certainly inefficient query patterns (loading too many pages at once, N+1 queries, unindexed searches) rather than a fundamental limitation of pages-as-assets. These are the same problems any ORM-based system hits at scale, and they have well-known solutions.

What MediaHub Would Need for 100K Assets

| Problem | Fix | Effort |
|---|---|---|
| Crops filter loads all crop pages | Direct SQL: SELECT DISTINCT master_id | Small |
| N+1 crop count per listing | Batch query or denormalized count field | Small |
| LIKE search on title/desc | Add FULLTEXT indexes | Small |
| Tag dropdown loads all tags | Already paginated, or use autocomplete | Small |
| No query caching | Add WireCache for expensive queries | Medium |
| Standard pagination at scale | PW handles this natively with start= + limit= | Already done |
| Streaming large exports | Use $pages->findMany() (lazy loading) | Small |

None of these require rethinking the data model. They're standard database optimization work.

Summary
The page-per-asset architecture is sound and scalable. ProcessWire's page model with proper indexing handles 100K+ pages without issue. The current MediaHub implementation has a handful of query patterns that would need optimization (mostly the crop filter and N+1 counts), but these are straightforward fixes. Switching to a JSON-in-field approach would solve the wrong problem while creating a bespoke data layer that loses most of PW's value and introduces its own harder-to-fix scaling issues.

//END

What do you think of the above?
1 point
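Two of those fixes can be sketched with the stock ProcessWire API (the selector strings and cache name below are hypothetical, for illustration only):

```php
// Streaming large result sets: findMany() loads pages lazily
// instead of holding 100K Page objects in memory at once.
foreach($pages->findMany("template=media-asset, sort=-created") as $asset) {
    // process one asset at a time
}

// Query caching: WireCache memoizes an expensive count for an hour.
$cropCount = $cache->get("mediahub-crop-count", 3600, function() use($pages) {
    return $pages->count("template=media-crop");
});
```

Both calls are standard PW 3.x API ($pages->findMany() and $cache->get() with a callable fallback), so neither requires changing the page-per-asset model.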
-
Hey David

Thanks for that. I am planning another sprint next week. Right now, I am approaching this from a traditional image-as-page approach: each crop variation of a 'master' image is also a separate page. That's possibly not as good for scalability, but better from the point of view of having a new page image field listing all my images and crops. I don't have any sites with 10K+ images, so I'm happy enough with this solution, but I will certainly reconsider, and you've given me some other ideas too which I'll tackle next week. Will DM you re. other items. Cheers
1 point
-
It looks fantastic so far. Great work. 😯 I'm currently struggling with similar problems and requirements for a media manager. However, it needs to be scalable enough to easily handle 100,000 assets (for a news portal). My experiments with the principle that each asset is also its own page quickly reached the limits of performance. I am therefore pursuing the approach of storing the actual asset paths in the database, i.e. as a JSON object in a separate input field. A well-known problem: what happens if you rename the asset in the file system? I am still in the testing phase to see if this can be solved with a simple SQL replace and update. However, I would be very happy to test your module. Maybe I don't have to reinvent the wheel after all. 🙂

Here are a few more wishes that I have as requirements in my projects:
- Folders: Customers love folders and folder trees.
- Multiple use: An asset is used multiple times on different pages; it must also be possible to have different descriptions here. With the MM module, for example, the description is global and cannot be changed individually for each reference.
- Language versions: An asset is only available for certain language versions, so it can be deactivated for EN, for example, and will not be displayed in this language in the frontend.
- Automatic categorisation: An API that enables automatic categorisation after uploading/saving. Example: In the 'News' template, there is a select field called 'Category'. If a new image is uploaded in the edit mode of the page, it should be automatically loaded into the folder of the selected category, or alternatively tagged. In the best case, this creates an automatic structure if you can define several fields such as 'News/2026/Category'. The date is read from a date field and the category from a select field, etc.
1 point
-
PromptAI v2.7 released Adds support for the fantastic RockPageBuilder module. Prompts are now usable in all text, file and image fields inside RPB blocks.1 point
-
It might be some years since this was released, but I've just found how useful it is combined with Repeater Matrix. I was looking at a complex site where the designer had asked for multiple different page layouts, and I was trying to figure out how to avoid a chaotic mess with a huge number of fields and templates. Repeater Matrix solved half the problem, but working out how to apply the depth information was a different story. This module, along with the example, gave me the other half of the solution I needed. I'm working with Bootstrap, and it was easy to add a field in my Repeater Matrix types to specify the CSS classes for each element, and to have some types that are basically just containers with no direct content but contain other blocks.
1 point
-
A 100% protection is impossible in a browser environment: whatever a browser can show has to be available locally. But you can make it less easy to "steal" the files by preventing direct access and hotlinking to the font files:

RewriteEngine On
# block font requests whose referer is not your own domain
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?yourdomain\.com [NC]
RewriteRule \.(woff|woff2)$ - [F]

The most effective way to protect a font is to make it "incomplete" for anyone who steals it. By using the fonttools library (specifically pyftsubset), you can strip away all characters that aren't needed for your specific site. There's a little benefit too: the file size shrinks 😉
1 point
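The pyftsubset step could look roughly like this (the filenames and Unicode range are just examples; fonttools is installed via pip, and the brotli package is needed for WOFF2 output):

```shell
pip install fonttools brotli

# keep only basic Latin plus Latin-1 punctuation/accents, output as WOFF2
pyftsubset MyFont.ttf \
    --output-file=MyFont.subset.woff2 \
    --flavor=woff2 \
    --unicodes="U+0020-007E,U+00A0-00FF"
```

Anyone who copies the resulting font file only gets the glyphs your site actually uses.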
-
What a coincidence! I’d just released a big update to my PromptAI module when I saw your post 🤯 As far as I can tell from your examples, our use cases differ slightly. Looks really good!1 point
-
I have never been loyal to tools for the sake of it. If something stops earning its keep, I move on. The reason I have stayed with ProcessWire for close to ten years is simple: it continues to make sense for how I work. I still look after sites I built many years ago, and most of them just run. No rewrites, no upgrade stress, no feeling that past work is a liability. The API has stayed stable, and when it has changed, it has been deliberate and predictable. That matters when you are responsible for client sites long-term. What really locked me in early on was the front-end freedom. PW never told me how a site should look or behave. It gave me solid building blocks and allowed me to choose. I can build very different sites without switching platforms or fighting opinionated defaults, and that freedom is something I value. The forum is another reason I am still here. You, the people in this community, take the time to understand a problem before jumping to solutions. That is very rare. The discussions are thoughtful, practical, and grounded in real experience, and I have learned a lot simply by reading how others approach things. And finally, trust. I trust ProcessWire not to chase trends simply for attention, and not to trade clarity or performance for fashion. Ten years on, it still feels like a system built by people who actually build websites. For me, that combination has been hard to beat.1 point
-
I second this wholeheartedly. The community is among the very best out there, and the lack of opinion, the clear structure and the ease of extending make PW a wonderful tool. It's a sad fact that my days of working with ProcessWire are mostly over. My job responsibilities have changed over time, and the demand for wholly integrated cloud systems led my employer to migrate our intranet site with tens of thousands of pages and a lot of advanced functionality to another platform (let's not talk about the manpower needed to do that and the gaps left). There are of course advantages, but I can say that we had a tailored-to-fit solution on a level you don't find often, from ordering breakfast or lunch from local suppliers, over advanced forms connected to HR systems and Active Directory data, providing specialized integrated databases and automated workflows to our departments, to driving technical sales with dynamically generated interlinked views on bills of material, stocks and data sheets pulled directly from SAP. A piece of software more than 60% of 1300 worldwide employees used daily and that ran with 100.0% availability on a single IIS server with only a bit of memcache magic to keep things speedy. Over more than ten years, periodic updates went through with nary a hitch. My heart bleeds a bit. Not working with PW every week also means that I'm not actively using the modules I built anymore. I'll have to go over my little babies one by one, retire those that have been surpassed by better approaches by now and find new pet owners for the others.1 point
-
Generates a .phpstorm.meta.php file for ProcessWire autocompletion in PhpStorm.

Features
- Autocomplete wire container keys for wire('...') and Wire::wire('...')
- Autocomplete module names for Modules::get() and Modules::install()
- Autocomplete field names for Fields::get()
- Autocomplete template names for Templates::get()
- Autocomplete unique page names for Pages::get()
- Autocomplete hookable methods for Wire::addHook*()
- Autocomplete page status constants/strings for Page::status(), addStatus(), removeStatus(), hasStatus()
- Autocomplete field flags for Field::addFlag(), removeFlag(), hasFlag()
- Autocomplete template cache-expire constants for Template::cacheExpire()
- Autocomplete Inputfield collapsed constants for Inputfield::collapsed()
- Autocomplete sort flags for WireArray::sort()/sortFlags()/unique() and PageArray::sort()/sortFlags()/unique()
- Optional: field type autocompletion per Page class (when enabled in module config)

Usage
Default path: site/assets/.phpstorm.meta.php (configurable in module settings). The file regenerates automatically when fields, templates, or modules change (debounced). You can manually regenerate from the module settings screen. Optional: enable "Generate page-class field metadata" in module settings for field type hints per Page class. This is intentionally basic. For richer field stubs, use AutoTemplateStubs.

Examples

Modules
$tracy = $modules->get('TracyDebugger'); // Autocomplete + correct class type for navigation and code insight

Wire Container
$page = wire('page');
$pages = $this->wire('pages');
$cache = wire('cache'); // Autocomplete for keys like page/pages/cache/etc.

Fields
$body = $fields->get('body'); // Autocomplete field names, fewer typos

Templates
$tpl = $templates->get('basic-page'); // Autocomplete template names

Pages
$home = $pages->get('/'); // Maps to the page class when page classes are enabled

Page Status
$page->status(Page::statusHidden);
$page->addStatus('draft');
$page->removeStatus(Page::statusUnpublished);
$page->hasStatus('locked');

Field Flags
$field->addFlag(Field::flagAutojoin);
$field->removeFlag(Field::flagAccess);
$field->hasFlag(Field::flagGlobal);

Template Cache Expire
$template->cacheExpire(Template::cacheExpireParents);

Inputfield Collapsed
$inputfield->collapsed(Inputfield::collapsedYesAjax);

Sort Flags
$items->sort('title', SORT_NATURAL | SORT_FLAG_CASE);
$items->sortFlags(SORT_NATURAL);
$items->unique(SORT_STRING);

Page-Class Field Metadata (Optional)
$home = $pages->get('/'); // $home is HomePage (page class)
// Field types are inferred from the template fieldgroup
// e.g. $home->hero_image -> Pageimage or Pageimages depending on field settings

Hooks
$wire->addHookAfter('Pages::save', function($event) {
    // Autocomplete hookable methods while typing the hook string
});

Notes
- Hook scanning reads ProcessWire core, modules, and admin templates to build the hook list.
- If page classes are enabled, page names map to their page class; otherwise they map to Page.
- Improvement suggestions and PRs are welcome.

https://github.com/phlppschrr/ProcessWirePhpStormMeta
1 point
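For illustration, PhpStorm's meta files use the PHPSTORM_META namespace with override() and map(); a generated file might contain entries along these lines (a sketch only; the actual module output depends on your site's fields, templates, and modules):

```php
<?php
// .phpstorm.meta.php — illustrative sketch, not the module's exact output
namespace PHPSTORM_META;

// wire('pages') and $this->wire('pages') resolve to Pages, etc.
override(\ProcessWire\wire(0), map([
    'pages'   => \ProcessWire\Pages::class,
    'page'    => \ProcessWire\Page::class,
    'modules' => \ProcessWire\Modules::class,
]));

// $modules->get('TracyDebugger') resolves to the module class
override(\ProcessWire\Modules::get(0), map([
    'TracyDebugger' => \TracyDebugger::class,
]));
```

The file is never executed at runtime; PhpStorm parses it statically to drive autocompletion and type inference.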
-
Some more work. Not sure why the images are so fuzzy.

Image detail page
- You can edit the usual stuff: Title, Alt, Description, and add Tags etc.
- Some utilities in there too, such as Download, Copy URL, Duplicate, Delete.
- If an image has crop versions, they are displayed under the main image.
- Crop versions have thumb and table views.
- A crop version has a detailed view too.

Image Crop page
- There are presets, but you can create your own named crops.
- Save as a crop version or save as a new image.

The hardest part is in progress, which is:
- A custom Inputfield which allows you to add images from the MediaHub, all while maintaining a connection back to the hub source file. I.e. it's important that there's no duplication and that the images in the Media Hub are a true source / canonical version.
- Displaying images in the page edit field in a nice, consistent way with the existing UI. As much as possible, I want the user to feel like this is core and not some bolt-on with its own CSS. Although I am changing some things…

Hope you like! P
1 point
-
Hey everyone, I have some updates to MediaHub to share.

Media Hub view
- Screenshot of the Grid view: this shows a thumbnail of all your images. Each card has helpful metadata (PNG, file size, etc). Some images have crops applied, denoted by the small pink badge, e.g. Lisbon tiles has 4 crop versions. Usual filters at the top and a search bar.
- Screenshot of the Table view: handy if you have hundreds of images. Displays tags too.
- Screenshot of the Upload / drag-and-drop mode: there's some nice animation / UI when the system is uploading several images.

Tomorrow I'll share more...
1 point
-
We've been working on yet another project in ProcessWire, and the more we use it, the more amazed we are by what this incredible CMS can do. From an SEO, developer, usability, and customization perspective, it's truly outstanding. My team was deeply involved with Joomla! for 10 years, since its foundation, so we've seen a lot. After years of using ProcessWire, I just want to thank @ryan and everyone who has contributed, whether through code, ideas, support, or anything else. What a beauty, what a powerful CMS! 🚀
1 point