flydev 👊🏻

☁️ Duplicator: Backup and move sites


Thanks @flydev 👊🏻 for your hard work on this. I installed on a Debian server and had a few issues which I have managed to fix. I submitted a bit of a messed up PR - I think we were both working at the same time.

All my tests were in native mode

1) The server must have the "zip" package installed - mine didn't, which caused me some grief for a while. I suggest checking for this and warning if it's not installed.

2) I had to change the mysqldump command quite a bit - you can see the details in the PR, but the key things were adding --single-transaction, specifying the DB explicitly, and removing --skip-lock-tables (rough sketch below).

3) Had to chmod the .sh script to 744 so it was executable, but obviously this may not work on all servers depending on file ownership etc.

4) Had to change into the script's directory and adjust the exec call to actually execute the .sh file - it was failing to run otherwise.
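
Roughly, points 2-4 boil down to something like this - note the credentials, database name and paths below are only placeholders, not the module's actual code (the real change is in the PR):

<?php
// Rough sketch only - placeholder values, not Duplicator's actual configuration.
$user = 'db_user';
$pass = 'db_pass';
$db   = 'my_database';
$dumpFile   = '/path/to/package/database.sql';
$scriptFile = '/path/to/package/build.sh';

// 2) dump one named database; --single-transaction gives a consistent dump of
//    InnoDB tables without table locks, which is why --skip-lock-tables could go
$cmd = sprintf(
    'mysqldump --single-transaction -u%s -p%s %s > %s',
    escapeshellarg($user),
    escapeshellarg($pass),
    escapeshellarg($db),
    escapeshellarg($dumpFile)
);
exec($cmd, $out, $ret);

// 3) make the helper script executable (may not be permitted on every host)
chmod($scriptFile, 0744);

// 4) run the script from its own directory so relative paths inside it resolve
chdir(dirname($scriptFile));
exec('./' . basename($scriptFile), $out, $ret);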

With those changes everything seems to work great, although I did notice that after running a duplication via the Process module, it no longer reloads the page when done, so you don't see the new package unless you reload manually once it's finished.

Thanks again!

 


Thank you very much for this report, @adrian.

1 hour ago, adrian said:

1) The server must have the "zip" package installed - mine didn't, which caused me some grief for a while. I suggest checking for this and warning if it's not installed.

Already on the todo list.
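
For now it will probably be a simple pre-flight test along these lines (just a sketch, not the final implementation):

<?php
// Sketch of a pre-flight check for the zip binary - not the final implementation.
exec('command -v zip', $out, $ret);
if ($ret !== 0) {
    // in the module this would surface as a warning in the admin UI
    echo "The 'zip' package does not seem to be installed on this server.\n";
}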

1 hour ago, adrian said:

2) I had to change the mysqldump command quite a bit - you can see the details in the PR, but the key things were adding --single-transaction, specifying the DB explicitly, and removing --skip-lock-tables

I will check the PR after this message, but this can depend on the database schema on the user's end. The next version will include a number of options to customize the behavior of mysqldump. I was also thinking of providing a textbox to enter custom switches for the command line before proceeding.

1 hour ago, adrian said:

3) Had to chmod the .sh script to 744 so it was executable, but obviously this may not work on all servers depending on file ownership etc.

Thanks, I will add some checks on the destination directory of the script.

1 hour ago, adrian said:

4) Had to change into the script's directory and adjust the exec call to actually execute the .sh file - it was failing to run otherwise.

I am fairly sure this is because you have safe_mode enabled - if you can confirm that, it would be awesome.

  

1 hour ago, adrian said:

With those changes everything seems to work great, although I did notice that after running a duplication via the Process module, it no longer reloads the page when done, so you don't see the new package unless you reload manually once it's finished.

Yes, it's there for debugging purposes and to avoid the page reloading - just set the $debug variable to false in ProcessDuplicator.js.

 

Thanks again, going to check the PR.


One more thing: the next minor version will contain an encryption setting, in an effort to make Duplicator a bit more GDPR compliant.

4 hours ago, flydev 👊🏻 said:

I am fairly sure this is because you have safe_mode enabled - if you can confirm that, it would be awesome.

Are you talking about PHP's "safe_mode"? I'm thinking not, because it hasn't been available since PHP 5.4. Sorry, is there another safe_mode setting I'm not thinking of?
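
If it helps narrow it down, the more common culprit these days would be exec() being listed in disable_functions - a quick check (nothing Duplicator-specific, just plain PHP):

<?php
// Quick check: is exec() blocked via the disable_functions ini setting?
$disabled = array_map('trim', explode(',', (string) ini_get('disable_functions')));
if (!function_exists('exec') || in_array('exec', $disabled, true)) {
    echo "exec() is not available on this server.\n";
} else {
    echo "exec() is available.\n";
}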


@flydev 👊🏻 @adrian

Thanks for all your hard work on this invaluable module.

I've updated a few sites to the latest Duplicator version, but I have been running into an issue when AWS backups are enabled.

When AWS backups are enabled and I go to Setup -> Duplicator, I get a 'Call to a member function getTimestamp() on bool' error.

See the attached screenshot.

The site core and modules are all up to date as of today (Jan 14, 2020).

Using PHP 7.3; I got the same error with PHP 7.2.

Amazon libraries installed with Composer.

Screen Shot 2020-01-14 at 2.11.26 PM.png


I just tested it and the process ended without error 🤦🏻‍♂️

Annotation 2020-01-15 005128.png

 

Which version of Duplicator are you running?

Edit1:

Ok got it. Will push a fix.

Edit2: 

You can update the module to version 1.3.15.


Hi Flydev,

Fantastic, thanks for the quick fix.

I'm no longer getting the error.

However, I do see 'Invalid Timestamp' displayed in the Created column when viewing the list of my AWS backups.

 

Screen Shot 2020-01-14 at 6.40.31 PM.png

15 hours ago, GhostRider said:

However, I do see 'Invalid Timestamp' displayed in the Created column when viewing the list of my AWS backups.

I fixed the issue (thanks for reporting it) so you can access the Process module again, but at the moment I have no idea where the error is or what is causing it.

Could you try making other AWS backups with a different website and post the result? Meanwhile I will re-read the code to see if I can catch something.
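
For reference, that kind of 'getTimestamp() on bool' error usually means DateTime::createFromFormat() returned false because the date string taken from the package name didn't match the expected format. The defensive pattern is roughly this - illustrative only, the format string and input value are assumptions, not the actual module code:

<?php
// Illustrative only: guard against createFromFormat() returning false before
// calling getTimestamp(). The format string and input value are assumptions.
$dateFromFilename = '2020-01-14_14-11-26';
$date = DateTime::createFromFormat('Y-m-d_H-i-s', $dateFromFilename);
if ($date instanceof DateTime) {
    $timestamp = $date->getTimestamp();
} else {
    $timestamp = 0; // fall back and show "Invalid Timestamp" instead of fataling
}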


Screenshot added of another site showing the same "Invalid Timestamp" under the Created column.

The "Invalid Timestamp" message only appears for the AWS backups.

It's not really an issue, as the database name contains a timestamp.

Everything else is working great.

Screen Shot 2020-01-15 at 7.27.44 PM.png


Hi Flydev,

I have been thinking about a feature request, if you are taking any. I don't want to add more work, but perhaps it's something to think about.

On the Duplicator settings page, I have used the "Files and folders excluded from" checkbox (very handy) and also the "Custom excluded files and paths" field.

I'm wondering if there is any way of excluding custom directories, rather than file extensions?

Here are a few examples:

1. After adding the AWS library via Composer, Duplicator is backing up the Composer directory called vendor (at the same level as wire, files etc.). This directory has become quite large over time.

2. We also at times create directories at the same level for legacy files when building new client sites, where the legacy files still need to be accessible. Our latest client had a very large media folder of nearly 7 GB, i.e. /media/.

3. When using ProCache, it adds two folders in the assets directory called ProCache-###### and pwpc. The ProCache-###### folder is a collection of all the static versions of the pages on the site, which can also become quite large depending on the site.

With most of these directories, we do have a good idea what file extensions are being used and can exclude them in the 'Custom excluded files and paths' section.

However, the Composer vendor directory uses so many file extensions across a deeply nested group of folders that it makes for a very long list of possibilities to exclude.

With a few of our sites, our backups went from ~90MB to over 700MB.

 

It would be great to hear your thoughts on excluding directories/folders when you have time.

Below is a list of some of the file paths we struggle with.

 

/vendor or any custom directory at this level

/site/assets/ProCache-###

/site/assets/pwpw/*

 

Thanks


@GhostRider, what's wrong with the 'Custom excluded files and paths' box? To exclude a path simply enter it without a file pattern like:

/site/assets/pwpw/
/vendorOrAnyCustomDirectoryAtThisLevel/

To exclude the ProCache stuff, a regular expression should do:

%/ProCache-[^/]+$%  # Ignore ProCache directory

To exclude image variants you may try this:

%\.\d+x\d+\.[^/]+$%  # Ignore variants created by PW


My apologies, this was user error on my part.

I was using the full server paths, rather than the site path.

/home/accountname/public_html/site/vendor/ rather than /site/vendor/

@Autofahrn thanks for your post.

 

8 hours ago, GhostRider said:

I have been thinking about a feature request, if you are taking any. I don't want to add more work, but perhaps it's something to think about.

No problem - as Andreas said, everything you requested is already implemented. If you have other features in mind, do not hesitate, guys 💪

Also, with the next version, you will be able to back up big sites with Duplicator. I can make a backup of a 1.6 GB website without issue. It can already be tested - it's on the dev branch on GitHub.


 


Meanwhile, I am developing the documentation site for Duplicator with ProcessWire and VueJS. As a side note, this profile will be made available to the community.

dup-Annot.png

 

 


UPDATE: Solved. I had gulp and the Browsersync node module running while I packaged up the site, which injected a stray script into the installer.php page. I found it, removed it and re-ran the process. All good now.

When I upload the package and installer and open the installer.php file on the server, I get this error:

Parse error: syntax error, unexpected '__bs_script__' (T_STRING), expecting ',' or ';' in /...n/installer.php on line 334


I love this module! I'm wondering, though, how to make this easier as we build out a site's CMS.

When I'm working locally and create new fields or have new CMS content updates, is there a way to sync up or easily push this to the cloned remote site's database without having to create a whole new backup, upload it and recreate the DB all over again on the second server?

9 hours ago, shogun said:

When I'm working locally and create new fields or have new CMS content updates, is there a way to sync up or easily push this to the cloned remote site's database without having to create a whole new backup, upload it and recreate the DB all over again on the second server?

Have you tried the Migrations module? It's a bit of overhead to create migration classes instead of using the admin UI, but it's nice to have reproducible template and field changes across all environments.

Then you'd only need to mirror or sync /site/assets/files/ and /site/templates/ somehow.

 


Hi Guys,

I'm trying to get Duplicator up and running as an automated backup solution. While it works as expected when I trigger it manually, I'm still figuring out the automation. But this will come 🙂

Anyway, I wanted to report that if you delete packages via the Package Manager, the corresponding logs do not get deleted. This results in packages still being shown in the summary on the config page (see attachment). This might be intended behaviour, but it confused me 🙂 I could fix it by deleting the logs manually via FTP.

Cheers Sascha

2020-02-14_10-16-23.png


Hi @saschapi

Thanks for reporting this issue. I also noticed it and fixed it in the dev version.

The next stable version (which comes with native backups) should be pushed next week.

  

On 2/14/2020 at 10:23 AM, saschapi said:

I'm still figuring out the automation.

Just create a scheduled task which points to Duplicator's cron helper file. If you need further help, just ask here 🙂
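
For example, a crontab entry along these lines runs it every night at 3am - the path is only a placeholder, point it at the actual helper file shipped in the module's folder:

0 3 * * * /usr/bin/php /path/to/site/modules/Duplicator/<cron-helper-file>.php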

 

On 1/29/2020 at 3:22 AM, shogun said:

I love this module! I'm wondering, though, how to make this easier as we build out a site's CMS.

Hello @shogun, I have the same requirement and I am using what @d'Hinnisdaël suggested.

I use Migrations with a self-made GUI, and Duplicator for daily backups.
 

On 2/17/2020 at 7:14 PM, flydev 👊🏻 said:

The next stable version (which comes with native backups) should be pushed next week.

 

Any news on the new stable version? Currently I'm getting this error when trying to activate the module.



Hi @B3ta

I had too much work this week, so it's planned for next Tuesday.

Can you upload the screenshot again, please? I can't see it. Thank you.


Thank you for the work you put into building this plugin and for the great news!  

23 hours ago, flydev 👊🏻 said:

Can you upload the screenshot again, please?

Here is the error I'm getting once I try to activate the Dropbox backup option.

N8UTC3LR.png


@B3ta

I don't have the same code on that line 32 - look at the repo here: https://github.com/flydev-fr/Duplicator/blob/v1.3.14/Classes/DupLogs.php#L32

 

I tested Dropbox before posting and it's working (even if it looks like there is a small issue with the timestamp).

Capture d'écran 2020-03-02 à 11.02.22.png

FYI, the latest version is 1.3.14, so I think you should update your Duplicator - let me know 😅


Hello,
I have an issue with PHP 7.4 about the deprecated function get_magic_quotes_gpc() used inside installer.php... any quick fix? Thanks in advance.

Schermata 2020-04-01 alle 11.06.29.jpg


Since get_magic_quotes_gpc() always returns FALSE as of PHP 5.4, the installer line

$value = get_magic_quotes_gpc() ? stripslashes($_POST[$field]) : $_POST[$field];

can safely be replaced with

$value = $_POST[$field];

(assuming you are not installing on a legacy platform)
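
And if the installer ever has to run on a pre-5.4 platform as well, a version-guarded variant keeps both cases working:

// get_magic_quotes_gpc() is only called on PHP < 5.4, where it can still return TRUE
$value = (PHP_VERSION_ID < 50400 && get_magic_quotes_gpc()) ? stripslashes($_POST[$field]) : $_POST[$field];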

 

 


