Mikie last won the day on December 6

Mikie had the most liked content!

Community Reputation

29 Excellent

About Mikie

  • Rank
    Jr. Member


  1. Mikie

    Not sure, but I think they’d be smart enough to recognize the same card. That credit would cover multiple projects, though.
  2. Mikie

    For all the reasons people here love ProcessWire, I thought I'd share a great accounting app with a similar feel / philosophy. It's called Manager https://www.manager.io and it is amazing. I used to use it years ago and am now moving back from QuickBooks; it almost makes me enjoy doing my books.
  3. Mikie

    I only use Cloud for Compute and Storage (the billing is really flexible), but looking here it seems that credit is per billing account: https://developers.google.com/maps/billing/understanding-cost-of-use
  4. Mikie

    Hi @bernhard, I appreciate and understand all your comments, but I feel like you are missing the point. There are two concepts/workflows at play, and whilst they aren't mutually exclusive they do serve different purposes:

      • Migrations = manually migrate config and content changes by writing PHP using the API
      • JSON Export/Import = automagically migrate config (not content!) changes you have made in the admin panel

    Both have pros and cons, and in some situations one or the other may not even be viable, so the choice is important. I personally wish you could export field definitions as PHP (like you can with ACF), which is sort of a middle ground between these two options.

    To be clear, my feature request was just about updating the current core Import/Export to (optionally) enable a more seamless workflow like the below screencast. In it I migrate a new field defined in the admin from one WordPress installation to another in seconds. You cannot do this with Migrations, and with version control and the right logic for syncing this is a completely viable way to migrate config. I am not suggesting this should be the canonical way to migrate config, just one option that sits alongside something like Migrations.
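    For anyone unfamiliar with the Migrations side of the comparison above, here is a minimal sketch of what "writing PHP using the API" looks like. The field name, label and fieldgroup ('subtitle', 'basic-page') are hypothetical, not from the post:

    ```php
    <?php namespace ProcessWire;

    // Hypothetical migration step: create a text field via the ProcessWire API
    // and add it to an existing fieldgroup.
    $field = new Field();
    $field->type = $modules->get('FieldtypeText');
    $field->name = 'subtitle';
    $field->label = 'Subtitle';
    $field->save();

    // Attach the new field to the 'basic-page' fieldgroup (illustrative name)
    $fieldgroup = $fieldgroups->get('basic-page');
    $fieldgroup->add($field);
    $fieldgroup->save();
    ```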
  5. Mikie

    @arjen I agree with this, it is by no means perfect. E.g. with ACF, if you have the latest database but old JSON then you will be prompted to sync changes (which would be a reversion). I feel like this is a shortcoming of the ACF implementation though; surely field updates could be timestamped? Maybe I am not thinking the complexities through. However for a single dev it's pretty simple, and with multiple devs there should always be a repo and database that can be considered the source of truth. If things get out of whack, just pull from latest.
  6. Mikie

    Thanks @bernhard, I've added my thoughts over in that discussion:
  7. Mikie

    Since I posted a feature request for something similar without knowing this discussion was happening, I just thought I'd chime in. Obviously JSON sync has drawbacks, but here is the workflow I have experienced using ACF with WordPress that works really well (LOCAL is a clone of REMOTE, and REMOTE is the source of truth):

      • LOCAL - work on fields, and perhaps content / other config such as templates
      • LOCAL - check the auto-generated field JSON into version control for other devs to pull, or push to REMOTE as part of deployment
      • REMOTE - sync field updates in the admin, and if other config / content mismatches then things fail gracefully

    To explain the last step: say you have added the field to a template on LOCAL that doesn't exist yet on REMOTE. On REMOTE nothing goes wrong, and if you re-save the field on REMOTE it just updates to the current state there. This is about syncing fields, not other config, so things should just fall back to defaults if relying on other config that could conflict.

    I can definitely see how content (not other config) can quickly become a problem when moving outside of this workflow, but if all you are doing is testing things locally, pushing to live, then syncing the database from live back down to local, this is a really quick and efficient way of iterating field updates and sharing them with other devs.
  8. It would be handy to automate import and export of fields across installations. The reference for this is the WordPress ACF JSON functionality, which in practice allows checking field changes into version control and syncing field config updates without having to worry about the database. This would be achieved by automatically creating JSON export files when saving a field, checking these JSON files for changes in the fields list, and providing UI to import the changes if wanted. It seems like a module could easily manage this using available hooks, but just wondering if others would find this handy or if it is on any sort of roadmap. Sorry if I am doubling up on this, there is lots of talk around different modules like Migrations but nothing specifically about automating the current field import / export functionality.
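    The hook-based module idea described above could look something like the minimal sketch below. It assumes ProcessWire's hookable Fields::save and the Field::getExportData() method that the admin's export screen uses; treat both, and the output directory, as assumptions to verify:

    ```php
    <?php namespace ProcessWire;

    // Sketch only: write a JSON export file every time a field is saved,
    // so field config can be checked into version control (ACF-style).
    wire()->addHookAfter('Fields::save', function(HookEvent $event) {
        $field = $event->arguments(0);
        // getExportData() is what the core export tool uses (assumption)
        $json = wireEncodeJSON([$field->name => $field->getExportData()], true);
        // Hypothetical output location under /site/
        $dir = wire('config')->paths->site . 'fields-json/';
        if (!is_dir($dir)) wireMkdir($dir);
        file_put_contents($dir . $field->name . '.json', $json);
    });
    ```

    A companion admin screen would then diff these files against the database and offer a one-click import, mirroring ACF's sync prompt.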
  9. Mikie

    Hey @Tom. JAMstack itself is just a marketing term created by Netlify’s CEO to onboard people onto their hosting platform and CMS. Netlify CMS is not really a CMS, it is an admin panel that integrates with static site generators like Hugo and Jekyll. The use case for static site generators is usually very simple sites with basic structured content like blogs and portfolios. See examples here https://jamstack.org/examples/ The JAMstack methodology also encourages using cloud CMS platforms like Contentful, however there are benefits and drawbacks to this I won't go into.

    To set up something like what you are asking about (but way more fun and flexible) with ProcessWire, imagine you have a domain http://api.website.com pointed to a ProcessWire instance with a homepage template like this:

    ```php
    <?php namespace ProcessWire;

    header('HTTP/1.1 200');
    header('Content-Type: application/json; charset=utf-8');
    header('Access-Control-Allow-Origin: *');

    $data = [];
    $projects = $pages->find("template=project");

    foreach ($projects as $p) {
        $data[] = [
            'title'   => $p->title,
            'content' => $p->sometextareafield,
        ];
    }

    echo json_encode($data);
    ```

    and the main domain http://website.com is just a static site hosted at Netlify, and your index.html contains the below:

    ```html
    <body>
      <div id="projects"></div>
      <script type="text/javascript">
        $(document).ready(function() {
          $.ajax({ url: "http://api.website.com" }).done(function(data) {
            var $projects = $('#projects');
            data.forEach(function(project) {
              var html = "<div class='project'>" +
                "<h2>" + project.title + "</h2>" +
                project.content +
                "</div>";
              $projects.append(html);
            });
          });
        });
      </script>
    </body>
    ```

    ... then you have the beginnings of a single page javascript app. Of course building an entire site like this and incorporating routing and state management etc. would become tiresome very quickly in jQuery, which is why javascript frameworks like React and Vue exist.

    Regarding build tools and task runners like Webpack and Gulp, they exist to do what they say, i.e. bundle up static assets or run tasks. Before getting to that though, it is important to understand the way people use Node and npm for front end dev. See this article maybe: https://www.impressivewebs.com/npm-for-beginners-a-guide-for-front-end-developers/ I personally hate Webpack, I use npm scripts and the CLIs of my preferred libraries to do everything. I can post an example if you like, but these blog posts sum it up: https://deliciousbrains.com/npm-build-script/ https://www.keithcirkel.co.uk/how-to-use-npm-as-a-build-tool/
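    A hedged sketch of the npm-scripts approach mentioned above — the tool choices (sass, esbuild) and file paths are illustrative, not taken from the post:

    ```json
    {
      "scripts": {
        "css": "sass src/scss/main.scss public/css/main.css",
        "js": "esbuild src/js/main.js --bundle --minify --outfile=public/js/main.js",
        "build": "npm run css && npm run js",
        "watch:css": "npm run css -- --watch",
        "watch:js": "npm run js -- --watch"
      }
    }
    ```

    The idea is that each library's own CLI does the work and `npm run build` just chains them, with no bundler config file to maintain.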
  10. Mikie

    What source control do you use? For jobs where clients are on shared hosting, https://github.com/git-ftp/git-ftp works really well for basic deployment and even rollbacks if necessary. I can tell you my workflow with it if you'd like, but it's pretty simple. Re the database, I'm not sure exactly what you are after. If I have SSH access I use a bash script based off this one https://gist.github.com/samhernandez/25e26269438e4ceaf37f to sync local and remote databases and files. I use it to pull down the latest state from the server, but you could also use it to push from local, I guess, or sync between staging and live.
  11. Thanks for the update. Makes sense that you want to keep this functionality separate.
  12. Hi @kongondo, sorry for any confusion. I am purely talking about a frontend account area for customers where they can review orders, manage account info etc. while staying on brand with the look and feel of the shop. When I mentioned authentication I meant incorporating a login / registration flow similar to Ryan's Login/Register module. See some screenshots from the WooCommerce Storefront theme below. The code for this is here, but I can spin up a demo for you if you want.
  13. Excited about this. One feature request that I am not sure is covered above (apologies if I missed it and am doubling up) is a front-end customer account area. Again, WooCommerce is a good reference point for the basics involved. A separate optional module would make sense, and the ability to style / customise it as needed would be important. Maybe it could be less of a predefined account area and more of a module to manage users, authentication and customer endpoints on the frontend.
  14. Mikie

    Hey, glad you figured it out. I just realised I'd misunderstood your previous post.
  15. Mikie

    A few things... With the axios call, try querying the relative and not the absolute URL. Also, take note of the trailing slash on the URL, sometimes that can trip you up depending on your template settings. Finally, you probably don't need to set the entire response to your Vue data object; with axios you are looking for response.data.

    ```javascript
    axios.get('/ajax-actions/test-api/').then(response => (this.info = response.data));
    ```

    If you need to keep the full absolute URL, or if you are calling the API page from a different URL, then look into CORS. Setting the headers below on your template should be enough, depends on your setup (e.g. see this topic for issues with CORS when using ProCache).

    ```php
    <?php
    header('Content-Type: application/json; charset=utf-8');
    header('Access-Control-Allow-Origin: *');
    // ...
    ?>
    ```