
Duplicator: Backup and move sites


@flydev the clear log button will be helpful, thanks. I have one request: when upgrading to the latest version, I have to manually move the SDK files from the old folder (starting with a dot: ".Duplicator") to the new, updated one. Is there a way to automate this task? It's easy to do manually but I have to remember to do it each time I upgrade.


I see, it's not possible with ProcessWireUpgrade, as the SDKs are not bundled with the module. I will try to find a workaround, like a button to import the SDKs from a previous Duplicator version. If you have a better idea, I am open :)
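Something like this is roughly what such an import button would have to do under the hood. A sketch only: the ".Duplicator" source path follows the dotted folder naming mentioned above, and the "vendor" subfolder name is an assumption about where the SDKs live.

<?php
// Sketch: copy the SDKs from the old (dotted) module folder into the upgraded one.
// Run from the ProcessWire root; adjust the paths to what you actually have.
function copyDir($src, $dst) {
    if (!is_dir($dst)) mkdir($dst, 0755, true);
    foreach (scandir($src) as $item) {
        if ($item === '.' || $item === '..') continue;
        is_dir("$src/$item") ? copyDir("$src/$item", "$dst/$item") : copy("$src/$item", "$dst/$item");
    }
}
$modules = __DIR__ . '/site/modules';
copyDir("$modules/.Duplicator/vendor", "$modules/Duplicator/vendor"); // 'vendor' is an assumption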


@flydev I have installed 1.1.7 now.

Not sure yet if it works (set to LazyCron, every hour).

Meanwhile, I installed the GoogleDrive SDK via Composer. After reloading the module config page, I still can't click the GoogleDrive option. Where do I put the vendor folder from Composer? And why does https://github.com/flydev-fr/duplicator-google-drive return a 404 error page?


35 minutes ago, flydev said:

like a button to import the SDKs from a previous Duplicator version.

+1 Along with a hard-to-miss notification, maybe?

32 minutes ago, dragan said:

Not sure yet if it works (set to LazyCron, every hour).

It will, after you visit a page, in an hour. (If you want to try LazyCron quickly, set it to "LogoutTrigger", log out of the admin, then log back in and check your packages.)
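For reference, this is roughly what a LazyCron hook looks like - a sketch you could drop in site/ready.php. LazyCron only fires on a page view once the interval has elapsed, so with no traffic nothing runs:

<?php namespace ProcessWire;
// site/ready.php - sketch only: log a line at most once per hour,
// triggered by the first page view after the hour has elapsed.
wire()->addHook('LazyCron::everyHour', function(HookEvent $e) {
    $seconds = $e->arguments(0); // seconds since the hook last ran
    wire('log')->save('lazycron-test', "everyHour fired after $seconds seconds");
});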

 

32 minutes ago, dragan said:

Where do I put the vendor folder from Composer?

How are you running Composer? On localhost and then uploading the vendor folder, or directly on the server?

The vendor folder should be alongside the "wire" and "site" folders.

[screenshot: FTP tree showing the vendor folder alongside the wire and site folders]
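With that layout, a template or module can pull in Composer's autoloader relative to the ProcessWire root. A minimal sketch (recent PW 3.x versions may already load a root vendor/autoload.php on their own, so check before adding this):

<?php namespace ProcessWire;
// Sketch: make Composer packages (e.g. the Google SDK) autoloadable.
// Assumes /vendor sits next to /wire and /site as shown above.
$autoload = $config->paths->root . 'vendor/autoload.php';
if (is_file($autoload)) require_once $autoload;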

 

32 minutes ago, dragan said:

And why is https://github.com/flydev-fr/duplicator-google-drive an error 404 page?

The right repo is here: https://github.com/flydev-fr/Duplicator-SDKs

 



@flydev Yes, I'm trying it out locally first. So: vendor, site + wire are all on the same level?

The local backup (triggered twice now with logout) didn't work. See the screenshot of my current module config.

[screenshot: current Duplicator module config]


Big thanks to @flydev - he offered to help via TeamViewer.

However, when I installed the latest version on the remote server, it didn't work. I set it to 30 seconds with LazyCron, and nothing happened. I clicked around several pages in the frontend and backend... Only by manually triggering a backup via the package manager did I get one.

And even then, the entire site was frozen. You couldn't do anything (front- or backend) while it was creating the backup. I understand such a process takes a lot of resources, but is this expected behavior?


2 hours ago, dragan said:

And even then, the entire site was frozen. You couldn't do anything (front- or backend) while it was creating the backup. I understand such a process takes a lot of resources, but is this expected behavior?

If it's any reassurance, what you are experiencing here is normal; it has to do with PHP's session-locking mechanism.

To confirm it, you can do a simple test:

    - open the browser in a private "incognito" window and navigate to your website

    - open a normal tab and manually start a Duplicator job

    - then, while the job is running, navigate your website in the incognito window

You will see that you have no issues and navigation on the website is normal.
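The reason: PHP locks the session file for the entire request, so a long-running backup in one tab blocks every other request sharing that session; the incognito window has its own session and is unaffected. If you were writing such a job yourself, releasing the lock before the heavy work avoids the freeze - a minimal sketch:

<?php
// Sketch: release the session lock before long-running work so other
// requests from the same browser session are not blocked meanwhile.
session_start();
$userId = $_SESSION['userId'] ?? null; // read what you need first ('userId' is hypothetical)
session_write_close();                 // lock released; other requests can proceed
runLongBackupJob();                    // hypothetical long-running work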

 

I will test LazyCron more thoroughly tomorrow. Let me know if anything goes wrong with the above ;)

 



Really minor, but sometimes the order of local vs. Google Drive packages is switched. Maybe it would be nice if they were always consistent?

[screenshot: package list with local and Google Drive packages in switched order]


Hello,
trying to upload to S3, I get a "delete" message that I don't understand. Why are files being deleted?
Locally all files are safe. The S3 user is the same one I'm using with other PHP scripts.

2017-12-26 17:38:29:  AmazonS3: getting files.
2017-12-26 17:38:29:  - job finished in 16.970454sec
2017-12-26 17:38:29:  AmazonS3: deleted 2017-12-26_17-38-12-www.studiotm.it.package.zip
2017-12-26 17:38:29:  AmazonS3: deleting files...
2017-12-26 17:38:29:  AmazonS3: upload complete to https://s3-eu-west-1.amazonaws.com/htmlfactory-bkp/2017-12-26_17-38-12-www.studiotm.it.package.zip on bucket htmlfactory-bkp.
2017-12-26 17:38:25:  AmazonS3: uploading file /home/studiotm/public_html/site/assets/backups/2017-12-26_17-38-12-www.studiotm.it.package.zip
2017-12-26 17:38:25:  - package saved in local folder: 2017-12-26_17-38-12-www.studiotm.it.package.zip
2017-12-26 17:38:25:  - package built successfully in 12.817685sec
2017-12-26 17:38:24:  - structure zipped successfully in 11.472255sec


Thanks and...  happy holidays :)

Piero


Hi Piero,

just to be sure - as I tested it on two versions and don't get the package deleted - did you change the following settings?

  •  Remove backup packages older than
  •  Maximum number of packages
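For context, those two settings are what drive automatic deletion: after each job a retention pass removes packages past the age limit and then trims down to the maximum count. Roughly this kind of logic - a sketch, not Duplicator's actual code:

<?php
// Sketch of a typical retention pass (NOT Duplicator's actual implementation):
// delete packages older than $maxAgeDays, then keep at most $maxCount newest.
function pruneBackups($dir, $maxAgeDays, $maxCount) {
    $files = glob("$dir/*.package.zip");
    usort($files, function($a, $b) { return filemtime($b) - filemtime($a); }); // newest first
    foreach ($files as $i => $file) {
        $tooOld  = (time() - filemtime($file)) > $maxAgeDays * 86400;
        $tooMany = $i >= $maxCount;
        if ($tooOld || $tooMany) unlink($file);
    }
}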

 


 

I'm trying to use the FTP backup function to a server and repeatedly get an error in the logs.

2017-12-27 09:30:11:  [FTP] error: cannot connect to <sftp://domain.ddns.me> on port <22>

I've checked and double-checked my credentials, and I have 4 separate FTP applications open here, all connecting with the same details.

I've also tried and re-tried combinations of regular FTP mode, port 21 vs. 22, passive mode, etc.

Are there any known FTP issues?

 

 


Which operating system and which PHP version are you using?


@jeve please change the "Remove backup packages older than" setting to "Never" and try again, then let me know :)

 

@Peter Knight are you sure you are not mixing up SFTP and FTPS? SFTP is based on the SSH protocol, which is not supported.
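FTPS is FTP over TLS and works with PHP's ftp extension, while SFTP is an SSH subsystem that would need something like the ssh2 extension or phpseclib. A sketch of the two supported connection styles (host and credentials are placeholders):

<?php
// Sketch: plain FTP vs. FTPS with PHP's ftp extension. SFTP is a different
// protocol (SSH) and cannot be reached with these functions at all.
$host = 'hostname.ddns.me'; // placeholder
$ftp  = ftp_connect($host, 21);        // plain FTP
// $ftp = ftp_ssl_connect($host, 21);  // FTPS (explicit TLS) on the same port
if (!$ftp || !ftp_login($ftp, 'user', 'secret')) die('connect or login failed');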


20 minutes ago, flydev said:

@Peter Knight are you sure you are not mixing up SFTP and FTPS? SFTP is based on the SSH protocol, which is not supported.

@flydev

Yes - most likely. Thanks for spotting that.

I only have the following protocols enabled on my destination, so if it can connect via FTPS then it could be a port number issue?

  • FTPS (port 21)
  • SFTP (port 22)

I turned on normal FTP mode and had more success, but not 100%:

2017-12-27 10:07:11:  - job finished in 34.438614sec
2017-12-27 10:07:11:  [FTP] error: cannot upload file <2017-12-27_10-06-37-www.mysite.com.package.zip> on server.
2017-12-27 10:07:11:  FTP: starting upload of /var/www/vhosts/mysite.com/httpdocs/site/assets/backups/2017-12-27_10-06-37-www.mysite.com.package.zip
2017-12-27 10:07:11:  FTP: directory listing of BackupFolder successfull.
2017-12-27 10:07:11:  FTP: retrieving directory listing of BackupFolder...
2017-12-27 10:07:11:  FTP: directory listing of ShareFolder successfull.
2017-12-27 10:07:11:  FTP: retrieving directory listing of Sharey01...
2017-12-27 10:07:11:  FTP: logged in.
2017-12-27 10:07:11:  FTP: connection established.
2017-12-27 10:07:11:  FTP: connecting to domain.ddns.me:21...
2017-12-27 10:07:11:  - package saved in local folder: 2017-12-27_10-06-37-www.mysite.com.package.zip

 

 


Good. And did you test uploading a file with a regular FTP client? It looks like a permission issue. Check your PM.

 

Edit: @Peter Knight please enable the "Passive Mode" setting in Duplicator > FTP settings.

[screenshot: Duplicator FTP settings with "Passive Mode" enabled]
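Why passive mode helps: in active mode the server opens the data connection back to the client, which NAT and firewalls usually drop - so login and directory listings succeed but the upload fails, exactly the pattern in the log above. In PHP it is a single call before the transfer; a sketch with placeholder paths:

<?php
// Sketch: passive mode makes the client open the data connection itself,
// which survives NAT/firewalls; without it, uploads often fail after login.
ftp_pasv($ftp, true); // $ftp from ftp_connect() + ftp_login()
ftp_put($ftp, 'BackupFolder/site.package.zip', '/local/path/site.package.zip', FTP_BINARY);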


18 minutes ago, flydev said:

Good. And did you test uploading a file with a regular FTP client? It looks like a permission issue. Check your PM.

Edit: @Peter Knight please enable the "Passive Mode" setting in Duplicator > FTP settings.

Finally working - thanks for the help.

Steps to resolve:

1. Use normal FTP mode instead of FTPS or SFTP

2. In the host field, use just hostname.ddns.me instead of ftp://hostname.ddns.me

3. Use passive mode as you suggested

Thanks again. Really nice module you have there. I normally do backups to a VPS because it has built-in Dropbox backups. However, that's getting to be an expensive enterprise, so storing them on my own NAS like this will really help. Looking forward to Dropbox integration too; I notice it's temporarily parked.


8 minutes ago, jeve said:

I made the change but S3 is still deleting files...

I will look at it tonight; it's strange, as I can't reproduce it. I set the same package name and the same settings, then the package is uploaded but nothing is deleted 🙈

Could you tell me if the bucket already contains other files?

Edit: my logs:

2017-12-27 12:05:25:  AmazonS3: getting files.
2017-12-27 12:05:25:  - job finished in 8.614877sec
2017-12-27 12:05:24:  AmazonS3: upload complete to https://duplicator-flydev-tokyo.s3.***.amazonaws.com/2017-12-27_12-05-17-www.studiotm.it.package.zip on bucket duplicator-flydev-tokyo.
2017-12-27 12:05:19:  AmazonS3: uploading file /www/sites/blog/wwwroot/web_backups/2017-12-27_12-05-17-www.studiotm.it.package.zip
2017-12-27 12:05:17:  - package saved in local folder: 2017-12-27_12-05-17-www.studiotm.it.package.zip
2017-12-27 12:05:17:  - package built successfully in 0.488209sec
2017-12-27 12:05:17:  - structure zipped successfully in 0.022769sec
2017-12-27 12:05:17:  - advanced option activated: ArchiveFlush
2017-12-27 12:05:07:  AmazonS3: getting files.

 



If I want this backup task to run every day at a given time, am I correct in thinking I require the PWCron module, and that I need to paste the appropriate command into the PWCron "Modules to run" field?

# minute hour day-of-month month day-of-week
4 0 * * * php /var/www/vhosts/mydomain.com/site/modules/PwCron/cron.php >/dev/null 2>&1
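For reference, the five cron fields are minute, hour, day-of-month, month and day-of-week, so the line above fires daily at 00:04. To hit a specific time of day, adjust the first two fields, e.g. (same hypothetical path):

0 3 * * * php /var/www/vhosts/mydomain.com/site/modules/PwCron/cron.php >/dev/null 2>&1    # daily at 03:00
30 22 * * * php /var/www/vhosts/mydomain.com/site/modules/PwCron/cron.php >/dev/null 2>&1  # daily at 22:30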

