
Using DDEV for local ProcessWire development (tips & tricks)


bernhard


Firstly, many thanks for the brilliant input regarding DDEV. I'm finally using DDEV for local development and it's really super fast. My latest project is now 100% finished locally and I saved a lot of waiting time.

@bernhard I understood from your posts that you have a staging version and a production version of your projects on the remote server.

- What is the benefit of a remote staging version for a single developer? Wouldn't it be easier to omit the staging server?

- What I still can't deal with is the following problem: If I put the website live now, my customer and their team will start editing content, creating pages, etc. in the ProcessWire backend. In addition, there will be contact form entries and blog comments from end users, i.e. ultimately user-generated content, which will also end up in the ProcessWire database.

The website is to be continuously developed, i.e. I will be adding fields, modules and templates over the coming weeks and months.

I understood, you would clone the database to local and continue working on the website locally, and later copy everything back to remote. But then I would overwrite the changes my client and the users have made in the meantime.

How do you deal with this problem?
Are you developing the site locally, and once deployed to production, do you work on the remote server? I tried to connect to a server database remotely, and it is super slow.

Before DDEV, I had the classic setup with uploading every change to the remote server via FTP and refreshing the browser manually. I do not really want to go back to that ...

Appreciating any help ... !


2 minutes ago, nurkka said:

What is the benefit of a remote staging version for a single developer

That isn't a question of one or multiple developers. I mostly use a staging server to showcase new functionality before it gets added to the live server.

3 minutes ago, nurkka said:

I understood, you would clone the database to local and continue working on the website locally, and later copy everything back to remote. But then I would overwrite the changes my client and the users have made in the meantime.

We use RockMigrations to ensure our development environment matches the live server. This allows Bernhard and me to add new fields and templates without risk of overwriting existing data during uploads, as any changes are automatically migrated.


36 minutes ago, dotnetic said:

We use RockMigrations to ensure our development environment matches the live server. This allows Bernhard and me to add new fields and templates without risk of overwriting existing data during uploads, as any changes are automatically migrated.

I have also tried RockMigrations, but I still don't use it as my standard. I'll take another look at RockMigrations, thank you very much!

Do I understand you correctly that when further developing a website that is already live, only templates, fields and modules can be updated?

For example, when I create a new page type, I not only create a new template and new fields, but also one or more pages and fill their fields with content.

Can this newly created content also be transferred to the production environment with RockMigrations? If so, that would mean that you have to create not only the new fields, but also all the new content in php code, and not via the ProcessWire backend, right?


59 minutes ago, nurkka said:

I have also tried RockMigrations, but I still don't use it as my standard. I'll take another look at RockMigrations, thank you very much!

It is built for exactly the use case you have mentioned: Working on a PW website/app while content is changing on the production system and continuously pushing updates to the live system without losing data and without any downtimes.

1 hour ago, nurkka said:

Do I understand you correctly that when further developing a website that is already live, only templates, fields and modules can be updated?

No, you can change everything. But you need to understand how things work and how they play together so that you don't mess anything up. For example you could do a $rockmigrations->deletePage(...) and if that page is needed for anything else it would break your site.
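To make this more concrete, here is a minimal sketch of what a declarative RockMigrations migration could look like. The field and template names are hypothetical, and the exact array keys should be checked against the RockMigrations docs:

```php
<?php
// A sketch of a declarative RockMigrations config. The field/template
// names are hypothetical; check the RockMigrations docs for exact keys.
// In site/migrate.php you would pass this array to $rm->migrate(...).
$migration = [
    'fields' => [
        'blog_body' => [
            'type'  => 'textarea',
            'label' => 'Blog body text',
        ],
    ],
    'templates' => [
        'blog-item' => [
            'fields' => ['title', 'blog_body'],
        ],
    ],
];
```

Because the migration lives in code, it can be committed to git and replayed on staging and production without touching the content that editors created in the meantime.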

1 hour ago, nurkka said:

For example, when I create a new page type, I not only create a new template and new fields, but also one or more pages and fill their fields with content.

That's something that is not yet built into RockMigrations. The reason is that there are so many ways to build a website in PW that you can't really have a simple $rockmigrations->pushContentToServer(...) or something. Also we'd have to deal with different page IDs on different systems etc etc...

If you are building a module, then you have to do these things in code. It has to be reusable after all. RockMigrations has several helpers for that, for example for autoloading pageclasses that you only have to put into the right folder. For example in RockCommerce I need one root data page. I have a pageclass for that rootpage and I create that rootpage at /rockcommerce, and below that page all other pages are stored, like invoices or orders etc. That's quite easy to handle, as I can identify pages by their names; name collisions are theoretically possible, but very unlikely (who would have a rockcommerce page in his/her installation?!).

When working on a new section of a page, for example you add a blog to your existing website, things are different. You need lots of content and you only need it once - not as a reusable module. In that case I'd build the structure with RockMigrations and then push the changes to the remote server. The remote server can either be a staging server to make sure I don't break production or to preview changes to the client while the old version is still on production and while my local laptop might be switched off.

Simple changes for simple websites can be pushed to production directly. Then you can populate the pages there via GUI and once done you can do "rockshell db:pull" and you have all your content on your local DDEV! What about the images, you ask? RockMigrations will automatically download them from the remote server while you are viewing backend or frontend pages and those files are requested. Just set $config->filesOnDemand = "https://www.example.com";

You could even create all fields and templates on production and do a "rockshell db:pull" regularly if changes are one-time and don't need to be reusable. RockMigrations does not force you into any workflow here - it just offers the tools to make things easier.
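For the filesOnDemand setting mentioned above, a sketch of how one might enable it only for local development (the URL is a placeholder; DDEV sets the IS_DDEV_PROJECT environment variable inside its web container):

```php
<?php
// site/config.php (sketch): only fetch missing asset files from the
// live server when running inside DDEV. The URL is a placeholder.
if (getenv('IS_DDEV_PROJECT') === 'true') {
    $config->filesOnDemand = "https://www.example.com";
}
```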


  • 2 weeks later...

@bernhard @dotnetic Thanks for your replies!

In my current project I followed your advice and now work with a local DDEV, a staging server and a live server. I don't have any automated scripts yet, but working on the command line (WSL2, DDEV and SSH) works really well. I also mostly got rid of large FTP uploads by using rsync. Working with these tools feels better every day.

But when trying to copy the local DDEV website to the staging server, I got stuck. I exported the database with ddev php RockShell/rock db:dump, uploaded it via rsync, connected via SSH and tried php RockShell/rock db:restore on the server. But as the ProcessWire database tables were not present on the server at this point, RockShell returned an error message, so I had to fall back to Adminer to import the SQL file.

What would be the right way to copy a locally developed website to a staging server with RockShell? 


12 hours ago, nurkka said:

But when trying to copy the local DDEV website to the staging server, I got stuck. I exported the database with ddev php RockShell/rock db:dump, uploaded it via rsync, connected via SSH and tried php RockShell/rock db:restore on the server. But as the ProcessWire database tables were not present on the server at this point, RockShell returned an error message, so I had to fall back to Adminer to import the SQL file.

What would be the right way to copy a locally developed website to a staging server with RockShell? 

Thx for that question, I think that's not 100% clear from the docs at the moment. I wanted to do a video on this topic for a long time, but videos are very time consuming.

You have to set up the whole website first and copy everything manually to the remote server; from that point on you can use RockMigrations for deployment (if you want) and RockShell for db operations. Hope that helps!


@bernhard Thanks again for your help! I finally managed to get SSH keys to work under Windows with WSL2 and with DDEV installed on Windows, so that now I am able to use RockShell the way it's meant to be used.

The reason I struggled with the SSH keys was that I had put the key files in the ~/.ssh folder of the WSL2 Linux installation. But as I am using the Windows version of DDEV and running the command line from the Windows side, the SSH keys must be placed in C:\Users\YourNameHere\.ssh

After that, one can simply use this command, which will copy the ssh keys into one's web container:

ddev auth ssh

Then one can use RockShell commands like so:

ddev php RockShell/rock db:pull

* * * * *

And here is how to add a shell function under Windows, when using PowerShell 5:

# Test if the PowerShell profile, where you can store custom shell functions, already exists
Test-Path $PROFILE

# if not, create the PowerShell profile
New-Item -path $PROFILE -type file -force

# Open the profile file with an editor. it's located here:
# Path: C:\Users\YourNameHere\Documents\WindowsPowerShell
# Name: Microsoft.PowerShell_profile.ps1

# Add the shell function to the profile file:
function rockshell {
    param(
        [string]$Command
    )
    ddev exec php RockShell/rock $Command
}

After that, save the file, restart PowerShell, and now it is possible to use RockShell like so:

rockshell db:pull

Many thanks @bernhard for creating RockShell, and for your help!


Have you ever had to revive an old project for whatever reason? Was it a pain to get it up and running, because you switched to a totally new laptop in the meantime?

I had this situation today and it was no problem at all thanks to DDEV.

Quote

ddev start

--> got an error that DDEV can't boot up the project because the path changed from /my/old/laptop/projectX to /my/new/laptop/projectX

But it also showed the solution: 

Quote

ddev stop --unlist projectX

And then again:

Quote

ddev start

--> Browser popped up, table xy not found - well yes, obviously I don't have the database on my new laptop!

Quote

ddev import-db -f site/assets/backups/database/db.sql

Boom, everything works! Even with a quite old project running a quite outdated version of PHP.


  • 2 weeks later...

I’m not quite sure that I understand the purpose of ‘upload_dirs’:

Quote

When Mutagen is enabled, DDEV attempts to exclude user-generated files in upload_dirs (if they exist) from syncing.


That’s from https://ddev.readthedocs.io/en/stable/users/install/performance/#mutagen-and-user-generated-uploads.

I think it isn’t really about uploads at all, but rather directories that might be better off with a slow-but-sure sync rather than a fast-but-eventual sync?

If that’s right, when would you actually need something like that — especially in the context of uploading?

Does anyone have a feel for this?


I've only just spotted this, but in my DDEV environment, if I modified a page the modified date was always recorded as one hour earlier.

So 

Quote

Last modified by admin on 2024-04-18 13:55:19 (1 hour ago)

appears on the settings tab immediately after doing a modification (at 14:55).

The production environment was OK. I have $config->timezone = 'Europe/London'; throughout in config.php. It seems that you also need to set the timezone (in my case 'timezone: Europe/London') in the DDEV config.yaml.
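For reference, the relevant line in .ddev/config.yaml might look like this (the timezone value is just my example):

```yaml
# .ddev/config.yaml -- keep the container timezone in sync with
# $config->timezone in ProcessWire's config.php
timezone: Europe/London
```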


On 4/18/2024 at 9:11 AM, MarkE said:

I've only just spotted this but, in my ddev environment, if I modified a page the modified date was always recorded as one hour earlier.

So 

appears on the settings tab immediately after doing a modification (at 14:55).

The production environment was OK. I have $config->timezone = 'Europe/London'; throughout in config.php. It seems that you also need to set the timezone (in my case 'timezone: Europe/London') in the ddev config.yaml

For handling datetimes in ProcessWire, I've come to realize that the best way for me to avoid the most headaches is to leave everything in UTC: the $config file, the DDEV settings, etc.

Then whenever I need to display times on the front end, I use a helper function I created that formats time in the Timezone of my choosing, at display time. 

With this method, the production server timezone or my dev environment timezone never matters.

The added benefit is that you can have users save a timezone to their profile and then display times in their timezone.

Any other way of dealing with timezone besides saving UTC to database has always created a mess for me later down the road.

function UtcToCst($timestamp)
{
    if (!$timestamp) return null;
    // Or use a timezone saved on your $user template.
    return (new \DateTime("@" . $timestamp))->setTimezone(new \DateTimeZone("America/Chicago"));
}
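A generalized variant of such a helper (a sketch; the function name utcToTz is made up for illustration) takes the timezone as a parameter, e.g. one saved on the user's profile:

```php
<?php
// Sketch: convert a UTC unix timestamp to an arbitrary timezone at
// display time. The function name utcToTz is made up for illustration.
function utcToTz(?int $timestamp, string $tz): ?\DateTime
{
    if (!$timestamp) return null;
    return (new \DateTime("@" . $timestamp))->setTimezone(new \DateTimeZone($tz));
}

// 86400 seconds after the epoch is 1970-01-02 00:00 UTC,
// i.e. 1970-01-01 18:00 in Chicago (CST, UTC-6):
echo utcToTz(86400, 'America/Chicago')->format('Y-m-d H:i'); // → 1970-01-01 18:00
```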

 


@bernhard: Thanks to your forum thread, I finally made the switch from my previously used XAMPP development stack to WSL2/DDEV on my Windows 11 machine. Now I do all my Python, PHP and Node/HTML projects within the Linux subsystem on Windows 11, using VS Code as my editor of choice. Only Windows desktop C# projects (WinForms, WPF) are still done on the Windows side using Visual Studio 2022.

Always wanted SSL certificates on my localhost. Installation was easy and my first project was set up within minutes by simply cloning and importing my MySQL dump into my ddev container.

Thanks for the inspiration.

 


Fun fact: creating a local backup of a live WordPress site was easier with DDEV too. MySQL and PHP versions can easily be set in .ddev/config.yaml to match the live versions. Adding a post-import-db hook automatically replaces the hardcoded site URLs in the database.sql dump to match the URL on my localhost when running ddev import-db --file=database.sql.
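A sketch of such a hook (both URLs are placeholders; wp-cli is available in DDEV's web container for WordPress projects):

```yaml
# .ddev/config.yaml -- sketch; both URLs are placeholders
hooks:
  post-import-db:
    - exec: wp search-replace "https://www.example.com" "https://mysite.ddev.site" --all-tables
```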


  • 2 weeks later...

I didn't know about DDEV until recently. It's been a gamechanger and I have yet to even harness its full potential. I use it on everything now and can't speak highly enough of it. Many thanks to @bernhard


1 hour ago, FireWire said:

I didn't know about DDEV until recently. It's been a gamechanger and I have yet to even harness its full potential. I use it on everything now and can't speak highly enough of it. Many thanks to @bernhard

Yeah, I also love it and wouldn't want to miss it. Especially handy is the ddev snapshot feature. Has saved me a ton of time already.


3 hours ago, gebeer said:

Especially handy is the ddev snapshot feature. Has saved me a ton of time already.

How do you use it? I've never used it myself.

Glad it is helpful @FireWire!


2 hours ago, bernhard said:

How do you use it? I've never used it myself

For example I have a website with a shop. And now I am developing a redesigned version which also has new fields and features. With snapshots I can go back and forth between the actual site (for example for bug fixing) and the relaunch site in seconds (or a minute). An import of the MySQL database would take hours because it is fairly large and has many thousands of inserts.

I am aware that there might be other options, like git worktree (or other techniques) and using a separate database for this kind of work.


12 hours ago, bernhard said:

How do you use it? I've never used it myself

What @dotnetic said. And when trying out things: snapshot, then add fields/templates/dummy pages etc. Like it -> keep it. Don't like it -> roll back to the last snapshot.


  • 1 month later...

Not sure if any of you ever ran into the issue that your .htaccess file redirects to a forced www version. Lately I use this in most of my setups.

There is the option to add additional hostnames to your DDEV config right away or later on from the CLI.

ddev config --additional-hostnames www.pwseo

You can update your .ddev/config.yaml manually as well - restart is needed in both cases.

name: pwseo
type: php
docroot: ""
php_version: "8.2"
webserver_type: apache-fpm
xdebug_enabled: false
additional_hostnames:
    - www.pwseo
additional_fqdns: []
database:
    type: mariadb
    version: "10.4"
use_dns_when_possible: true
composer_version: "2"
web_environment: []

This setup saves me headaches in terms of keeping different .htaccess files around for DEV/STAGE/PROD.
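A host-agnostic redirect rule along these lines (a sketch) works across environments, because it redirects to www. on whatever host the request came in on:

```apache
# .htaccess sketch: force www. without hardcoding the domain,
# so the same file works locally (www.pwseo) and in production.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^ https://www.%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```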


Now I have finally switched from Laragon (Windows) to DDEV (WSL2).
My first attempt some time ago was not very successful, but this time everything worked straight away.

However, I need a "special" feature for my project: 
sqlsrv/pdo_sqlsrv as a PHP extension for the connection to an MSSQL server

That was a minor challenge at first; then I found some older scripts on Stack Overflow, but unfortunately they no longer worked.

 

Here is a short guide on how to use sqlsrv (incl. msodbcsql18) with DDEV:

Create a "Dockerfile" and an "install_sqlsrv.sh" inside the .ddev/web-build folder.

Dockerfile:

ARG BASE_IMAGE
FROM $BASE_IMAGE

COPY install_sqlsrv.sh /var/tmp/
RUN apt-get update && DEBIAN_FRONTEND=noninteractive sudo apt-get install -y -o Dpkg::Options::="--force-confold" --no-install-recommends --no-install-suggests gcc make autoconf libc-dev pkg-config php-pear php-dev
RUN /bin/bash /var/tmp/install_sqlsrv.sh

install_sqlsrv.sh:

#!/usr/bin/env bash
export DEBIAN_FRONTEND=noninteractive
# Install sqlsrv drivers.
export PHP_VERSIONS="php8.1 php8.2 php8.3"
# Note: Only works for PHP 7.0+.
export PHP_SUFFIXES="8.1 8.2 8.3"


OS=$(grep -P '(?<=^ID=)' /etc/os-release | cut -c 4-)
VERSION=$(lsb_release -rs)


if [ ! -f /etc/apt/sources.list.d/mssql-release.list ]; then
  curl -fsSL https://packages.microsoft.com/keys/microsoft.asc | sudo gpg --dearmor -o /usr/share/keyrings/microsoft-prod.gpg
  curl https://packages.microsoft.com/config/debian/12/prod.list | sudo tee /etc/apt/sources.list.d/mssql-release.list
fi

sudo apt-get update
sudo apt -y update

for v in $PHP_VERSIONS; do
  sudo apt-get install -y "$v" "$v"-dev "$v"-xml
done

if [ ! -d /opt/microsoft ]; then
  sudo ACCEPT_EULA=Y apt -y install msodbcsql18 mssql-tools
  sudo apt -y install unixodbc-dev
  echo 'export PATH="$PATH:/opt/mssql-tools18/bin"' >>~/.bash_profile
  echo 'export PATH="$PATH:/opt/mssql-tools18/bin"' >>~/.bashrc
fi

for v in $PHP_SUFFIXES; do
  sudo pecl -d php_suffix="$v" install sqlsrv
  sudo pecl -d php_suffix="$v" install pdo_sqlsrv
  # This does not remove the extensions; it just removes the metadata that says
  # the extensions are installed.
  sudo pecl uninstall -r sqlsrv
  sudo pecl uninstall -r pdo_sqlsrv
done
for v in $PHP_SUFFIXES; do
  sudo bash -c "printf \"; priority=20\nextension=sqlsrv.so\n\" >/etc/php/\"$v\"/mods-available/sqlsrv.ini "
  sudo bash -c "printf \"; priority=30\nextension=pdo_sqlsrv.so\n\" >/etc/php/\"$v\"/mods-available/pdo_sqlsrv.ini "
  sudo bash -c "chmod 666 /etc/php/"$v"/mods-available/*sqlsrv*.ini"
done
sudo phpenmod sqlsrv pdo_sqlsrv

ddev start -> Aaaand BAM! Everything is working.

DDEV is simply great to use once you know how.


  • 3 weeks later...

@Stefanowitsch is using FTP to deploy his websites and ran into problems with DDEV (see here for details).

I didn't have these problems as I'm using automated deployments via RockMigrations + GitHub Actions, but I got this warning on every ddev start:

[screenshot of the Mutagen startup warning]

Both issues are related as Randy explains here: https://github.com/ddev/ddev/issues/6381

The solution is either to disable Mutagen or (imho better) to add this to your config:

upload_dirs:
  - site/assets/files

This should bring better performance, a startup without warnings and correct file permissions in /site/assets/files.


  • 1 month later...

Can someone help me? 

Each time I create a new project with "ddev config" I have to adjust the generated config.yaml file manually (PHP version, etc.).

I know there is a global ddev config file located here:

$HOME/.ddev/global_config.yaml 

As far as I understand, I can add some configuration options there, and each time I create a new project the settings should be inherited from this global config file. Unfortunately that does not work; my additions just get ignored.

I know I can add the arguments to my initial ddev config call like this:

ddev config --php-version 8.3

But I have a few more additions, and that command line gets way too long to type every time.

How do you handle this?

