
Working on a Web Project with PW


Bergy

Hello PW community,

My name is Alex and this is my first topic in this forum. I have been working with ProcessWire for about a year now and have found a lot here without having to ask any questions.

But now I haven't found much that would answer my question, or rather, I don't know what I should be looking for.

I started a little web project on my PC at home using XAMPP. After building some pages, templates, fields and so on, I decided to put the website online.

For that part I used ProcessExportProfile (and it worked great). But after making further changes, I noticed how awkward it is to "copy" over all the fields, tags and other changes I have made.

 

Now my questions are:

Is there a better workflow for developing offline and transferring the changes to the online site?

How do you all work on your web projects?

What tools do you use? I'm using IntelliJ IDEA (with the PHP plugin), Sublime Text, XAMPP and FileZilla.

Do you work with Git?

 

I don't know if this is the right part of the forum, but maybe someone has some advice or a link for me.

(Sorry for my English if anything sounds off; I'm from Germany :-) )

Best regards

Alex

 

 


Some modules to get you started

http://modules.processwire.com/modules/auto-export-templates-and-fields/
http://modules.processwire.com/modules/module-settings-import-export/
https://processwire.com/blog/posts/processwire-3.0.71-adds-new-core-module/

Edit:

As for my setup, I use PHPStorm (free for students, yay!) and an Ubuntu VM, with webpack for JS and PHPStorm's own file watchers for SCSS. I back up the themes and modules I develop with Git to private GitLab repos.

For syncing the local and production servers, you can set up SSHFS, bootstrap the remote PW installation into your local one, and export local changes to the remote. I haven't tried this and I'm not sure it'll work, but I'll give it a try, let you know if it's worth pursuing, and probably write a tutorial on it.
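The SSHFS part would look roughly like this (untested, and the host and paths are placeholders):

```bash
# Mount the remote site locally over SSHFS so its files can be
# diffed/synced against the local copy (host and paths are placeholders).
mkdir -p ~/mnt/remote-site
sshfs user@example.com:/var/www/site ~/mnt/remote-site

# Example: compare remote and local template folders
diff -rq ~/mnt/remote-site/site/templates ./site/templates

# Unmount when done
fusermount -u ~/mnt/remote-site
```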


I use a dead simple workflow:

  • I use my own bash script to clone the database from production to local (and/or vice versa) via SSH. It takes only a few seconds. I just have to make sure all database-related changes are performed on the production server. For testing purposes I occasionally make changes to the local database, of course, but I never sync any of that database data back. (A minimal sketch of such a clone follows this list.)
  • Files are synced with this great and fast tool: https://www.scootersoftware.com/download.php. When set up properly, all differences across all files are detected in somewhere between 20 and 60 seconds, depending on the server and its location on the globe. This tool is incredibly powerful. By regularly examining the differences, I have also learned a lot about the under-the-hood file changes made by the system. It is also very handy when upgrading and debugging.
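Not the author's actual script, but a minimal sketch of a production-to-local database clone over SSH (host, credentials and database names are placeholders):

```bash
# Dump the production database over SSH and load it straight into the
# local database (placeholders throughout; compression speeds up the transfer).
ssh user@example.com "mysqldump --single-transaction -u dbuser -p'secret' prod_db | gzip" \
  | gunzip | mysql -u root local_db
```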

So you keep editing your websites locally and then file-sync with the website online? Or just build a website locally? I edit my sites online and keep daily incremental backups on my own hard disk. It became too much of a hassle for me to keep all my sites running in a local WAMP.


1 minute ago, pwired said:

So you keep editing your websites locally and then file-sync with the website online?

This one.

2 minutes ago, pwired said:

It became too much of a hassle for me to keep all my sites running in a local WAMP.

I do not have an extreme number of sites to take care of. By working on local MAMP Pro clones, I do not have to worry about breaking anything. Isn't it too risky to work on the production site without testing in advance?


I would also recommend developing locally. I would go crazy if I had to wait a few seconds after every change. :lol:

My deployment workflow is currently:

  1. Track all files inside site/templates with Git and push/pull them to the remote server via SSH.
  2. Sync all files inside site/modules and site/assets/files via Sublime SFTP or Transmit.
  3. Sync the database with Sequel Pro.

But maybe some time I will try something fancy like an automated shell script. ;)
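Such a script could look roughly like this; a hypothetical sketch of the three steps above, not an actual setup (host, paths and database names are placeholders):

```bash
#!/bin/bash
# Hypothetical deploy script for the three steps above (all names are placeholders).
set -e

REMOTE=user@example.com
REMOTE_PATH=/var/www/site

# 1. Update templates on the remote server via Git
ssh "$REMOTE" "cd $REMOTE_PATH && git pull"

# 2. Sync modules and asset files (instead of Sublime SFTP / Transmit)
rsync -avz site/modules/ "$REMOTE:$REMOTE_PATH/site/modules/"
rsync -avz site/assets/files/ "$REMOTE:$REMOTE_PATH/site/assets/files/"

# 3. Push the local database to the remote server (instead of Sequel Pro)
mysqldump -u root local_db | ssh "$REMOTE" "mysql -u dbuser -p'secret' remote_db"
```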


1 minute ago, abdus said:

If you're using PHPStorm, there's a sync feature that lets you diff between remote & local and sync the differences automatically (other JetBrains IDEs should have it too)

Yep, I used to do that, but PHPStorm changed its licensing, which is too pricey for me. I hate subscribing to apps, anyway. I switched to NetBeans, which has similar features but is somewhat less powerful. As I mentioned, with a visual diff tool I can do a lot more. For example, I can compare files with different names at will. The system updater creates an htaccess file named something like "htaccess-3.0.76" (I cannot recall the exact name now...) and I can compare and sync it to .htaccess with a few clicks. This way I am also reminded to check the logs too, since those files are never in sync. Lots of cool stuff can be done with a good diff tool, not just simple code management ;)
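For anyone without a GUI diff tool, the command-line equivalent of that particular comparison would be something like this (the versioned filename is from memory, as noted above):

```bash
# Show line-by-line differences between the live .htaccess and the one
# shipped by the updater (the filename is approximate).
diff -u .htaccess htaccess-3.0.76
```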


14 minutes ago, szabesz said:

Is there a tutorial on the web which shows how it works? I would take a look at it... :)

Sorry if I may disappoint you, but by syncing with Sequel Pro I just meant exporting the local database and importing it on the remote server. Nothing intelligent, and of course it doesn't work on live sites where users have made changes in the meantime. In that case I think the core import/export module would be the best option, but I haven't really used it yet. ;)


17 minutes ago, abdus said:

MySQL offers a similar tool for migration

I took a look at it once; I cannot recall why I was not impressed ;)

3 minutes ago, AndZyk said:

Sorry if I may disappoint you

Never mind :) I was just curious, anyway. My bash script is half-baked, but it already does the trick. When finished, it will be a lot more powerful. I already have one for WP; now I just have to finish this one for PW...


Quote

Isn't it too risky to work on the production site without testing in advance?

True. I always have to install a fresh copy and try out a new module there first, for example, to see what happens. Usually for a webshop.

Quote

I would go crazy if I had to wait a few seconds after every change.

Also true, but just making changes is not that much work. Besides, the local database would always be different, and syncing might overwrite something online.


11 hours ago, Bergy said:

Now my questions are:

Is there a better workflow for developing offline and transferring the changes to the online site?

How do you all work on your web projects?


I used to work on localhost, but now that Internet connections are much more reliable and fast, I do all development directly on the web server. Localhost is only for backup purposes.

There are a lot of advantages to working that way: I can work from anywhere, and I can at least be sure that my setup matches the client's server configuration.


I also use PHPStorm as my main editor for developing locally, but I haven't used any of its syncing features.

As for the process, I have a staging web server with a copy of the client's site where they can see the changes on a daily basis. Once the site is ready (taking as an example a site without SSH access), I back up the website and send it to the 'live' server through Duplicator, which saves a lot of time when deploying the website multiple times after changes have been made; it takes less than a minute, depending on your bandwidth.

So the workflow is the following:

  1. Develop locally.
  2. FTP the 'local' site to the 'staging' web server with Duplicator.
  3. If the client needs some changes, repeat steps 1 and 2.
  4. The client is happy? FTP the site to the live server with Duplicator.
  5. The client has a new request? Back up and retrieve the live website locally, and repeat steps 1 to 4.

I must say that using Duplicator saves A LOT of time! And even when I have SSH access, I use Duplicator.

To keep different versions of a website, I use a self-hosted GitLab server to keep track of the packages made with Duplicator.

 

12 hours ago, AndZyk said:

 

I would also recommend developing locally. I would go crazy if I had to wait a few seconds after every change. :lol:

 

O0

 

@Bergy check this screencast on how easy it is to deploy your modified website with Duplicator.

Edited by flydev
screencast link

OK, only for the sake of clarity.

We need to:

  • transfer the template files | FTP
  • sync new fields, pages, tags and settings made locally with the server | import/export ???
  • sync the database | ???

Is there a tutorial for the import/export module? Does it contain everything we need from PW except the database?

Is there maybe a tutorial for syncing the database?

This is just for the files, right?

12 hours ago, abdus said:

If you're using PHPStorm, there's a sync feature that lets you diff between remote & local and sync the differences automatically (other JetBrains IDEs should have it too)

https://confluence.jetbrains.com/display/PhpStorm/Sync+changes+and+automatic+upload+to+a+deployment+server+in+PhpStorm

 

I would like to do all these steps manually first, and once I have understood what is going on, I would try to write a script to automate the process.

In my opinion, working on a live server is a big disadvantage; it's not for me.

 

 


9 minutes ago, flydev said:

So the workflow is the following:

  1. Develop locally.
  2. FTP the 'local' site to the 'staging' web server with Duplicator.
  3. If the client needs some changes, repeat steps 1 and 2.
  4. The client is happy? FTP the site to the live server with Duplicator.
  5. The client has a new request? Back up and retrieve the live website locally, and repeat steps 1 to 4.

 

But Duplicator doesn't sync the changes online, right? I'm looking for something that lets me do something like this:

  1. Develop locally.
  2. Take the site offline.
  3. Sync the server changes with the local changes.
  4. Sync the server DB with the local DB.
  5. Export the local page settings to the server.
  6. Export the local DB to the server.
  7. Put the site back online.

Right, the module doesn't sync changes. It only deploys a full copy of the current website.

If I remember correctly, @LostKobrakai made a module which does some of that. Let me check the forum and edit this post. @Bergy, here is the module (not a simple one!):

 

Edited by flydev
Migration Module

hi and welcome bergy,

the topic you are talking about is not an easy one... you can do a forum search (see my signature link) on keywords like "staging", "production", "synch" etc.

as always in the world of processwire you have free choice ;) personally i develop locally (i would strongly recommend you take a look at laragon: https://laragon.org/ . i replaced my xampp setup over a year ago and it's really great and fast, and you can even put it on a usb drive and have a full dev server with virtual hosts, ssl certificates, mailcatcher etc.).

regarding the migrations / staging / dev topic: i develop locally and when i'm (almost) done i put it on the live server. from that time on i develop directly on the server, with daily backups, via the vscode remote ftp plugin. it has some drawbacks but also some advantages, like monchu already mentioned: the quick-fix ability when you are abroad.

if you really need a proper dev/live staging system, migrations might be the way to go. for simple changes you can also create/edit/remove fields and settings directly in your module, so if you push an update to your live system you would also get the same changes in your database. in my projects it was easier to have a simple setup. in the rare case i needed bigger changes, i told my client not to make any changes on day X, and i changed their password to something else after copying the live site. of course, this would not work if you had user input like form submissions or the like. that said, it always depends on the situation.

summed up i would say there are three levels of complexity and what i do (or would do) in those cases:

  1. simple website with very simple edits --> live edit via SSH / FTP
  2. more complex edits with no changes while developing --> backup, do changes locally, upload back to live server
  3. complex changes with live changes while developing --> custom migration scripts, migrations module

FTP sync is way too slow; there are just too many handshakes for every file. I zip the local site folder, upload it and unzip it on the server. Then I empty the online database and import the local database. Clean up the cache and sessions, and done! So what more can this Duplicator do than that?
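A rough sketch of that manual routine, purely as an illustration (host, paths and database names are placeholders):

```bash
# Zip the local site folder, upload it and unzip it on the server.
zip -r site.zip site/
scp site.zip user@example.com:/var/www/
ssh user@example.com "cd /var/www && unzip -o site.zip && rm site.zip"

# Replace the online database with the local one
# (mysqldump includes DROP TABLE statements, so existing tables are replaced).
mysqldump -u root local_db | ssh user@example.com "mysql -u dbuser -p'secret' prod_db"

# Clean up the cache and sessions on the server
ssh user@example.com "rm -rf /var/www/site/assets/cache/* /var/www/site/assets/sessions/*"
```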


12 minutes ago, pwired said:

FTP sync is way too slow; there are just too many handshakes for every file.

That depends. Luckily PW is still relatively slim: 2248 files, although it gets a little fatter with each release... My diff tool can compare all the files in 38-45 seconds across the ocean on a speedy VPS. In the same city it performs even better: 18-20 seconds. I can refresh individual directories afterwards to skip unchanged files, so the whole process is fast enough to work this way. I update the local database from production in about 5-10 seconds with my bash script. Mine is a dead simple workflow which is good for a solo developer. Teamwork would require more, of course.


48 minutes ago, pwired said:

I zip the local site folder, upload it and unzip it on the server. Then I empty the online database and import the local database. Clean up the cache and sessions, and done! So what more can this Duplicator do than that?

Hi pwired, this is exactly what Duplicator does, no more and no less, as far as deploying the site goes. It just helps to avoid those repetitive tasks.


Interesting topic:

I'm a bash scripter too. I usually have two remote webspaces: 1. an htaccess-password-protected dev environment and 2. a production environment. Both environments have exactly the same specs.

At the beginning, the production environment is the development environment and is password-protected. Once the first release of the website is published, the htaccess protection is removed, and a clone of the production website is deployed to the development environment via bash script.

Both environments authenticate against each other without passwords, using RSA public keys over SSH.

The scripts do an incremental file backup and sync, plus a database backup and sync. Dev can be deployed to prod, and vice versa, with a single command in bash (or via PuTTY under Windows).

Updating ProcessWire to a new version is also done with a bash script, first in dev; afterwards, if everything runs smoothly, I run the deployment script dev -> prod.

My IDE (NetBeans) is connected via SFTP to the dev environment.

The bottom line is that I need three scripts:

deploy-dev-prod.sh

deploy-prod-dev.sh

update-pw.sh

This workflow is a bit static though: if, for example, the content of the production site changes heavily DURING development on the dev site (where structural changes take place, like changes to the page hierarchy, new content structures and so on), problems arise.
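Not the actual scripts, but a minimal sketch of the passwordless RSA setup and an incremental dev-to-prod file sync (hosts and paths are placeholders):

```bash
# One-time setup: generate an RSA key pair and install the public key on prod,
# so the deploy scripts can authenticate without a password.
ssh-keygen -t rsa -b 4096 -f ~/.ssh/id_rsa_deploy -N ""
ssh-copy-id -i ~/.ssh/id_rsa_deploy.pub user@prod.example.com

# Incremental file sync from dev to prod (only changed files are transferred).
rsync -az --delete -e "ssh -i ~/.ssh/id_rsa_deploy" \
  /var/www/dev/ user@prod.example.com:/var/www/prod/
```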

 

 

