Chris

your deployment process


Hi, do you guys use any special deployment tools, or do you have interesting deployment workflows you'd like to talk about? :)

At the moment I do it the old-fashioned way: I develop on a local XAMPP install, and when I'm finished I upload the files via FTP to the webspace and import the database manually with phpMyAdmin.

But I'd like to evolve my workflow into a more flexible deployment process. For instance, it would be nice to develop on my local XAMPP but be able to synchronize all my files and MySQL data to a dev.cooldomain.com website with one button click. I also like the idea of a PHP tool that synchronizes two separate folders and databases on the webserver, the live and the dev version of a site, so I can test new things, show them to people, etc.


This was discussed a lot back when I was with MODX Evo and on their forums. FTP and MySQL sync tools are available everywhere: Unison, InstantSyncFTP, BatchSyncFTP, WinSCP, or Navicat, HeidiSQL, SQLBuddy, etc. However, the problem is the database. You see, there are paths stored in the online database that depend on the server hosting your live website and that differ from your local database.

Syncing the database between your local laptop and the online database can mess up these paths and end in disaster.

Besides that, you are not the only one updating the database when, for example, editing a website. The online database also updates itself, for example with the cache of the live website, which depends entirely on your site's visitors.

Or take a blog: users who subscribe to your website, counters, people who upload files to your website, etc. Sooner or later the online database will always be out of sync with the local database on your laptop. If you then sync between them, data will get lost or overwritten on the online database.

Cutting and pasting between your localhost install and the remote while editing may not be as efficient, but it's much more disaster-proof. Only sync static files such as HTML, CSS and JS. Try to keep them in one folder so you can easily update them with a simple FTP transfer between just two folders.


I work locally using MAMP Pro (OS X). The 'problem' for me has been getting a development version in front of a client. I can't use port-forwarding and DynDNS at all, and I'm a command-line and Git noob. But I recently discovered Pagekite, which lets you tunnel your localhost to a publicly available URL. With a bit of tweaking on the MAMP Pro side of things, I can serve a locally running site to a client not only for review, but also for them to add content. Example.

The last bit of the process is uploading the site files and moving the database when going live. If anyone is interested in more details of my setup let me know.


This won't help with the database, but have you considered using git? For my workflow I develop on XAMPP and use git. It lets me easily work between my laptop and desktop. And when I'm ready to show the site to a client, I log into my server, do a git pull, and then update the database.
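That pull-based flow can be simulated entirely with local repositories (all names here are hypothetical; on a real host the last step would run over SSH):

```shell
# "laptop" pushes to a central repo, the "server" pulls from it.
git init -q --bare hub.git          # stands in for GitHub/Bitbucket/etc.

git clone -q hub.git laptop         # the dev machine
cd laptop
git config user.email "dev@example.com"
git config user.name "Dev"
echo '<?php // site code' > index.php
git add index.php
git commit -qm "first version"
git push -q origin HEAD
cd ..

git clone -q hub.git server         # stands in for the webspace
# On a real server, later deploys would just be:
#   ssh user@example.com "cd /var/www/site && git pull"
ls server                           # index.php arrived via git, not FTP
```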


This won't help with the database, but have you considered using git? For my workflow I develop on XAMPP and use git. It lets me easily work between my laptop and desktop. And when I'm ready to show the site to a client, I log into my server, do a git pull, and then update the database.

I want/need to do this too, but I haven't learned Git well enough yet.

So far, as the sole contributor to my projects, it works wonderfully. And I can see great potential for using it in a team.

Sadly, my host has git enabled, but my pulls aren't working.

Something must be off with the host, I guess. So for the meantime, I work entirely offline and then FTP (slowly).


You see, there are paths stored in the online database that depend on the server hosting your live website and that differ from your local database.

Syncing the database between your local laptop and the online database can mess up these paths and end in disaster.

Why? Almost everything can be relative to the document root or the current page and arranged the same way locally and on the server. If something really must be in different places, manage that path info from one central spot. You could put those paths in /site/config.php, as long as you're careful not to use any system names. People have their own ways of doing things, but personally I think it's well worth the minimal effort to keep data and files as portable as possible.

I haven't been using PW very long, but I haven't seen anything turn up that would be a problem. If I'm missing something, please correct me; I wouldn't want to get caught out by that.


I use git in conjunction with git-ftp (https://github.com/git-ftp/git-ftp)

Instead of transferring the whole project, I thought: why not transfer only the files that changed since last time? Git can tell me which files those are. And even if you're working with different branches, git-ftp knows which files differ. No ordinary FTP client can do that.

It's awesome and very convenient.
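A minimal setup sketch, assuming git-ftp is installed; the FTP URL and user below are placeholders:

```shell
# git-ftp reads its target from git's own config, scoped to the repo.
mkdir demo-site && cd demo-site
git init -q
git config user.email "demo@example.com"
git config user.name "Demo"
echo '<h1>Hello</h1>' > index.html
git add index.html
git commit -qm "initial upload"

# Placeholder deployment target (never commit real credentials):
git config git-ftp.url "ftp.example.com/htdocs"
git config git-ftp.user "deploy"

# First-ever upload:       git ftp init
# Every deploy afterwards: git ftp push   (transfers only changed files)
git config git-ftp.url    # prints: ftp.example.com/htdocs
```

git-ftp tracks the last deployed commit on the server, which is how it knows what changed since the previous push.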


If I'm working on my own personal projects, I develop locally on MAMP and then push everything live to the server once I'm done. I don't use Git as much as I should; I need to work on that.

For client projects, I have a random domain name that I develop on. I just make sure the sites aren't indexed by any search engines and that only the client has the URL, so they can watch the site build and progress. Once I'm finished, they're happy and I've been paid, I migrate the site to their server. This has worked well for me over the years and keeps clients happy, as they can track the progress of the build.


For client projects, I have a random domain name that I develop on. I just make sure the sites aren't indexed by any search engines and that only the client has the URL, so they can watch the site build and progress. Once I'm finished, they're happy and I've been paid, I migrate the site to their server. This has worked well for me over the years and keeps clients happy, as they can track the progress of the build.

Very interesting method, thanks for sharing!


Have there been any new developments on this topic?

I've gotten better with git: merging, push/pull, and keeping various branches on my dev machine while the master branches here and there stay synced.


My development workflow / deployment process is git- and beanstalkapp-based, and heavily inspired by this video by Chris Coyier from css-tricks.com: http://css-tricks.com/video-screencasts/109-getting-off-ftp-and-onto-git-deployment-with-beanstalk/

Meaning:

  • working locally
  • but having a staging server, more or less like the one einsteinsboi has described above
  • and having a production server, the client's webspace

Beanstalk handles the deployments, which tend to be automatic on staging (just the master branch) and always manual on production.


do you guys use any special deployment tools, or do you have interesting deployment workflows you'd like to talk about? :)

I always start out with a prototype, which usually includes 4-6 example pages in plain old HTML, hopefully spanning all the layouts needed in a project. After that, I prefer to work on the actual server the site will be hosted on, maybe on a subdomain or in a different folder if it's a relaunch. The reason is that in my humble experience, development setups almost never match the setup of the client's server/webspace. There's always some small difference, and I really don't want to waste time replicating it in a local dev environment.

The prototype is usually based on a small starter project (I wouldn't call it a framework) I have built over the past couple of months. It includes a Grunt build script (which I love) with a lot of useful helpers like Sass, Bower, various tools to optimize images, Modernizr, jQuery, etc. It also takes care of combining and minifying CSS and JS files, and it can even work with the PW templates I create after finishing the prototype.

As for the actual deployment, I haven't included that in the build script. There are Grunt plugins for it, but I'm not entirely sure that would be very secure. So I still use plain old (S)FTP and SSH (if available on the client's server, which unfortunately is still rarely the case) to deploy the actual templates to the (development) site.


Does anyone know of a "portable" solution? Something that encapsulates the whole Apache/PHP/MySQL stack, so that I can move to another computer, open my text editor again, start the server and keep on developing?

I read that I could do this with Vagrant or Docker, but I'm not quite sure I'm understanding them correctly.


A portable web server that works with PW... Bitnami, maybe?


vagrant (+ virtualbox) + git is pretty encapsulated :)

Vagrant streamlines the process of creating virtual machines to develop on by providing 'prepackaged' virtual images plus options to include other provisioning software (like Puppet, Chef, etc.). Some people use Vagrant VMs as 'apps', meaning that each project has its own virtual machine; others have one VM for multiple sites (a la *AMP software).

In your case, you could have your site versioned in git together with your MySQL dump and Vagrantfile (the Vagrant 'recipe' for your VM), and when you come to a new machine you'd do something like 'git clone project && cd project && vagrant up', go have a coffee, come back and start working.

(Of course, it's never quite that simple IRL :D, but in essence Vagrant is quite nice. I'm in the process of setting up a local Vagrant box from a clean Ubuntu + serverpilot.io over VPN. Fun times.)
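The 'recipe' part might look like the minimal Vagrantfile written below; the box name, IP and synced folder are assumptions for illustration, and `vagrant up` itself needs VirtualBox installed, so it stays commented out:

```shell
# Write a minimal, hypothetical Vagrantfile for a single-site VM.
mkdir -p project && cd project

cat > Vagrantfile <<'EOF'
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/trusty64"
  config.vm.network "private_network", ip: "192.168.33.10"
  config.vm.synced_folder ".", "/var/www/site"
end
EOF

# On a fresh machine the whole setup would then be roughly:
#   git clone <project> && cd project && vagrant up
grep -c "config.vm" Vagrantfile   # → 3
```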


Then what OS are you using?

Geez, I'm sorry: OS X. Sorry for the stupid omission.


A VM workflow is really nice, especially cross-platform. If you take a look at Vagrant, I'd suggest trying out Ansible as the provisioner. It's a very nice and modular system that can be used on almost any machine you have SSH or console access to.


 ... If you take a look at vagrant I'd suggest trying out ansible as provisioner ... 

I took a serious try at Vagrant. Really, I did.

I spent a week trying its various plugins, hostsupdater and some other useful ones I'd read about. In the end, it just complicated my workflow.

I have as many as 8 localhost sites in various stages on my machine. I use only two hosts (NearlyFreeSpeech and DigitalOcean). I develop on Debian. Git repos for all projects are currently saved to Bitbucket, but I may soon be moving to GitLab :)

I really didn't want a new Vagrant instance running for each project, costing me several gigs of space.

Otherwise, I totally loved the experience. I think it's a pretty amazing concept! I went with Scotch Box and liked having everything set up :)


I should really work locally and then git to live, buuuuuuut on my solo projects I:

- work on a staging server (same type of server as the live one)

- export and install on the live server

- keep the staging version for template edits that can be copied across, keeping the templates in sync but not the content


I took a serious try at Vagrant. Really, I did.

I spent a week trying its various plugins, hostsupdater and some other useful ones I'd read about. In the end, it just complicated my workflow.

I have as many as 8 localhost sites in various stages on my machine. I use only two hosts (NearlyFreeSpeech and DigitalOcean). I develop on Debian. Git repos for all projects are currently saved to Bitbucket, but I may soon be moving to GitLab :)

I really didn't want a new Vagrant instance running for each project, costing me several gigs of space.

Otherwise, I totally loved the experience. I think it's a pretty amazing concept! I went with Scotch Box and liked having everything set up :)

That NearlyFreeSpeech looks really nice! I might try them for my personal website to save some bucks!


One thing I'm going to be working on soon is adapting Trellis to ProcessWire.

If any of you saw my video on adapting Sage for ProcessWire: Trellis is a project by the same group of people (Roots) that lets you set up an excellent development + staging + production environment using Ansible (and Vagrant for the development environment). Because of the similarities between WordPress and ProcessWire (well, in terms of the server stack required and the fact that both are CMSes), a lot can be borrowed from their setup in terms of approach and techniques. They use WP-CLI for some things, but from what I've researched, Wireshell can be swapped in for that nicely.

I'm really excited to see how it turns out, because this has been a huge missing piece in my workflow. This approach would replace things like Capistrano and any other deployment methods, assuming you use a dedicated VM for each site.

I highly recommend checking out Trellis to see how it's done.  It's thought out very well.

