
Using DDEV for local ProcessWire development (tips & tricks)


bernhard


Firstly, many thanks for the brilliant input regarding DDEV. I'm finally using DDEV for local development and it's really super fast. My latest project is now 100% finished locally and I saved a lot of waiting time.

@bernhard I understood from your posts that you have a staging version and a production version of your projects on the remote server.

- What is the benefit of a remote staging version for a single developer? Wouldn't it be easier to omit the staging server?

- What I still can't deal with is the following problem: If I put the website live now, my customer and their team will start editing content, creating pages, etc. in the ProcessWire backend. In addition, there will be contact form entries and blog comments from end users, i.e. ultimately user-generated content, which will also end up in the ProcessWire database.

The website is to be continuously developed, i.e. I will be adding fields, modules and templates over the coming weeks and months.

I understood that you would clone the database to local, continue working on the website locally, and later copy everything back to remote. But then I would overwrite the changes my client and the users have made in the meantime.

How do you deal with this problem?
Are you developing the site locally, and once deployed to production, do you work on the remote server? I tried to connect to a server database remotely, and it is super slow.

Before DDEV, I had the classic setup with uploading every change to the remote server via FTP and refreshing the browser manually. I do not really want to go back to that ...

Appreciating any help ... !


2 minutes ago, nurkka said:

What is the benefit of a remote staging version for a single developer

That isn't a question of one developer or many. I mostly use a staging server to showcase new functionality before it gets added to the live server.

3 minutes ago, nurkka said:

I understood that you would clone the database to local, continue working on the website locally, and later copy everything back to remote. But then I would overwrite the changes my client and the users have made in the meantime.

We use RockMigrations to ensure our development environment matches the live server. This allows Bernhard and me to add new fields and templates without risk of overwriting existing data during uploads, as any changes are automatically migrated.
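For illustration only, a declarative migration of that kind might look roughly like the sketch below. This is not runnable on its own (it assumes a ProcessWire installation with the RockMigrations module), and the field and template names are made up:

```php
<?php namespace ProcessWire;

// site/migrate.php - RockMigrations sketch (hypothetical names).
// Declaring fields/templates here means the same structure can be
// replayed on staging and production without touching content.
$rm = rockmigrations();
$rm->migrate([
  'fields' => [
    'subtitle' => [
      'type' => 'text',
      'label' => 'Subtitle',
    ],
  ],
  'templates' => [
    'blogitem' => [
      'fields' => ['title', 'subtitle'],
    ],
  ],
]);
```

The point is that only the *structure* lives in code; the content rows in the database are left alone, which is why client edits on production survive a deployment.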


36 minutes ago, dotnetic said:

We use RockMigrations to ensure our development environment matches the live server. This allows Bernhard and me to add new fields and templates without risk of overwriting existing data during uploads, as any changes are automatically migrated.

I have also tried RockMigrations, but I still don't use it as standard. I'll take another look at RockMigrations, thank you very much!

Do I understand you correctly that when further developing a website that is already live, only templates, fields and modules can be updated?

For example, when I create a new page type, I not only create a new template and new fields, but also one or more pages and fill their fields with content.

Can this newly created content also be transferred to the production environment with RockMigrations? If so, that would mean that you have to create not only the new fields, but also all the new content in php code, and not via the ProcessWire backend, right?


59 minutes ago, nurkka said:

I have also tried RockMigrations, but I still don't use it as standard. I'll take another look at RockMigrations, thank you very much!

It is built for exactly the use case you have mentioned: Working on a PW website/app while content is changing on the production system and continuously pushing updates to the live system without losing data and without any downtimes.

1 hour ago, nurkka said:

Do I understand you correctly that when further developing a website that is already live, only templates, fields and modules can be updated?

No, you can change everything. But you need to understand how things work and how they play together to not mess up anything. For example you could do a $rockmigrations->deletePage(...) and if that page is needed for anything else it would break your site.

1 hour ago, nurkka said:

For example, when I create a new page type, I not only create a new template and new fields, but also one or more pages and fill their fields with content.

That's something that is not yet built into RockMigrations. The reason is that there are so many ways to build a website in PW that you can't really have a simple $rockmigrations->pushContentToServer(...) or something. Also we'd have to deal with different page IDs on different systems etc etc...

If you are building a module, then you have to do these things in code. It has to be reusable after all. RockMigrations has several helpers for that, for example for autoloading pageclasses that you only have to put into the right folder. For example, in RockCommerce I need one root data page. I have a pageclass for that rootpage and I create that rootpage at /rockcommerce, and below that page all other pages are stored, like invoices or orders etc. That's quite easy to handle, as I can identify pages by their names; name collisions are theoretically possible, but very unlikely (who would have a rockcommerce page in his/her installation?!).

When working on a new section of a page, for example you add a blog to your existing website, things are different. You need lots of content and you only need it once - not as a reusable module. In that case I'd build the structure with RockMigrations and then push the changes to the remote server. The remote server can either be a staging server to make sure I don't break production or to preview changes to the client while the old version is still on production and while my local laptop might be switched off.

Simple changes for simple websites can be pushed to production directly. Then you can populate the pages there via GUI and once done you can do "rockshell db:pull" and you have all your content on your local DDEV! What about the images, you ask? RockMigrations will automatically download them from the remote server while you are viewing backend or frontend pages and those files are requested. Just set $config->filesOnDemand = "https://www.example.com";

You could even create all fields and templates on production and do a "rockshell db:pull" regularly if changes are one-time and don't need to be reusable. RockMigrations does not force you into any workflow here - it just offers the tools to make things easier.


  • 2 weeks later...

@bernhard @dotnetic Thanks for your replies!

In my current project I followed your advice and now work with a local DDEV, a staging server and a live server. I don't have any automated scripts yet, but working with the command line (WSL2, DDEV and SSH) works really well. Also, I mostly got rid of large FTP uploads by using rsync. Working with these tools feels better every day 🙂 

But when trying to copy the local DDEV website to the staging server, I got stuck. I exported the database with ddev php RockShell/rock db:dump, uploaded it via rsync, connected via SSH and tried php RockShell/rock db:restore on the server. But as the ProcessWire database tables were not present at that point, RockShell returned an error message. So I had to fall back to Adminer to import the SQL file.

What would be the right way to copy a locally developed website to a staging server with RockShell? 


12 hours ago, nurkka said:

But when trying to copy the local DDEV website to the staging server, I got stuck. I exported the database with ddev php RockShell/rock db:dump, uploaded it via rsync, connected via SSH and tried php RockShell/rock db:restore on the server. But as the ProcessWire database tables were not present at that point, RockShell returned an error message. So I had to fall back to Adminer to import the SQL file.

What would be the right way to copy a locally developed website to a staging server with RockShell? 

Thx for that question, I think that's not 100% clear from the docs at the moment. I wanted to do a video on this topic for a long time, but videos are very time consuming.

You have to set up the whole website first and copy everything manually to the remote server; from that point on you can use RockMigrations for deployment (if you want) and RockShell for db operations. Hope that helps!


@bernhard Thanks again for your help! I finally managed to get SSH keys working under Windows with WSL2 and with DDEV installed on Windows, so now I am able to use RockShell the way it's meant to be used 🙂 

The reason I struggled with the SSH keys was that I had put the key files in the ~/.ssh folder of the WSL2 Linux installation. But as I am using the Windows version of DDEV and the command line on the Windows side, the SSH keys must be placed in C:\Users\YourNameHere\.ssh

After that, one can simply use this command, which will copy the ssh keys into one's web container:

ddev auth ssh

Then one can use e.g. the following RockShell command like so:

ddev php RockShell/rock db:pull

* * * * *

And here is how to add a shell function under Windows, when using PowerShell 5:

# test, if the PowerShell profile, where you can store custom shell functions, already exists
Test-Path $PROFILE

# if not, create the PowerShell profile
New-Item -path $PROFILE -type file -force

# Open the profile file with an editor. It's located here:
# Path: C:\Users\YourNameHere\Documents\WindowsPowerShell
# Name: Microsoft.PowerShell_profile.ps1

# Add the shell function to the profile file:
function rockshell {
    param(
        [string]$Command
    )
    ddev exec php RockShell/rock $Command
}

After that, save the file, restart PowerShell, and now it is possible to use RockShell like so:

rockshell db:pull

Many thanks @bernhard for creating RockShell, and for your help!


Have you ever had to revive an old project for whatever reason? Was it a pain to get it up and running, because you switched to a totally new laptop in the meantime?

I had this situation today and it was no problem at all thx to DDEV 😎

ddev start

--> got an error that DDEV can't boot up the project because the path changed from /my/old/laptop/projectX to /my/new/laptop/projectX

But it also showed the solution: 

ddev stop --unlist projectX

And then again:

ddev start

--> Browser popped up, table xy not found - well yes, obviously I don't have the database on my new laptop!

ddev import-db -f site/assets/backups/database/db.sql

Boom, everything works! With a quite old project and a quite outdated version of PHP 😎🚀


  • 2 weeks later...

I’m not quite sure that I understand the purpose of ‘upload_dirs’:

When Mutagen is enabled, DDEV attempts to exclude user-generated files in upload_dirs (if they exist) from syncing.

That's from https://ddev.readthedocs.io/en/stable/users/install/performance/#mutagen-and-user-generated-uploads.

I think it isn’t really about uploads at all, but rather directories that might be better off with a slow-but-sure sync rather than a fast-but-eventual sync?

If that’s right, when would you actually need something like that — especially in the context of uploading?

Does anyone have a feel for this?


I've only just spotted this but, in my ddev environment, if I modified a page the modified date was always recorded as one hour earlier.

So

Last modified by admin on 2024-04-18 13:55:19 (1 hour ago)

appears on the settings tab immediately after doing a modification (at 14:55).

The production environment was OK. I have $config->timezone = 'Europe/London'; in config.php throughout. It seems that you also need to set the timezone (in my case 'timezone: Europe/London') in the ddev config.yaml.
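For reference, that setting is a single line in the project's DDEV config (config fragment, using the Europe/London example from above):

```yaml
# .ddev/config.yaml
timezone: Europe/London
```

After changing it, run ddev restart so the containers pick up the new timezone.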


On 4/18/2024 at 9:11 AM, MarkE said:

I've only just spotted this but, in my ddev environment, if I modified a page the modified date was always recorded as one hour earlier.

So

Last modified by admin on 2024-04-18 13:55:19 (1 hour ago)

appears on the settings tab immediately after doing a modification (at 14:55).

The production environment was OK. I have $config->timezone = 'Europe/London'; in config.php throughout. It seems that you also need to set the timezone (in my case 'timezone: Europe/London') in the ddev config.yaml.

For handling datetime in ProcessWire, I've come to realize that the best way for me to avoid the most headaches is to leave EVERYTHING in UTC: the $config file, the ddev settings, etc.

Then whenever I need to display times on the front end, I use a helper function I created that formats time in the Timezone of my choosing, at display time. 

With this method, the production server timezone or my dev environment timezone never matters.

The added benefit is that you can have users save a timezone to their profile and then display times in their timezone.

Any other way of dealing with timezone besides saving UTC to database has always created a mess for me later down the road.

function UtcToCst($timestamp)
{
    if (!$timestamp) return;
    // Convert a UTC unix timestamp to the display timezone.
    // Or use a timezone saved on your $user template.
    return (new \DateTime("@" . $timestamp))->setTimezone(new \DateTimeZone("America/Chicago"));
}

 


@bernhard: Thanks to your forum thread, I finally made the switch from my previously used XAMPP development stack to WSL2/DDEV on my Windows 11 machine. Now I do all my Python, PHP and Node/HTML projects within the Linux subsystem on Windows 11, using VS Code as my editor of choice. Only Windows desktop C# projects (WinForms, WPF) are still done on the Windows side using Visual Studio 2022.

Always wanted SSL certificates on my localhost. Installation was easy and my first project was set up within minutes by simply cloning and importing my MySQL dump into my ddev container.

Thanks for the inspiration.

 


Fun fact: creating a local backup of a live WordPress site was easier with DDEV too. MySQL and PHP versions can easily be set in .ddev/config.yaml to match the live versions. Adding a post-import-db hook automatically replaces the hardcoded site URLs in the database.sql dump to match the URL on my localhost when running ddev import-db --file=database.sql.
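Such a hook could look roughly like this (a sketch: it assumes a wordpress project type, where DDEV bundles WP-CLI, and both URLs are placeholders for the real live and local ones):

```yaml
# .ddev/config.yaml (sketch)
type: wordpress
hooks:
  post-import-db:
    # Rewrite hardcoded live URLs to the local DDEV URL after every import.
    - exec: wp search-replace "https://www.example.com" "https://example.ddev.site" --all-tables
```

With this in place, ddev import-db --file=database.sql leaves you with a database that already points at the local site.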

