flydev

Duplicator: Backup and move sites


@flydev the clear log button will be helpful, thanks. I have one request: when upgrading to the latest version, I have to manually move the SDK files from the old folder (the one starting with a dot, ".Duplicator") to the new, updated one. Is there a way to automate this? It's easy to do manually, but I have to remember to do it each time I upgrade.


I see; it's not possible with ProcessWireUpgrade, as the SDKs are not bundled with the module. I will try to find a workaround, like a button to import the SDKs from a previous Duplicator version. If you have a better idea, I'm open :)
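For illustration, a rough sketch of what such an import button could do under the hood: recursively copy the SDKs from the previous, dot-prefixed module folder into the current one. The paths and function name here are hypothetical, not Duplicator's actual code.

<?php
// Hypothetical sketch: copy the SDKs from the previous, dot-prefixed
// Duplicator folder into the freshly upgraded module folder.
function importSdks(string $oldDir, string $newDir): void
{
    @mkdir($newDir, 0755, true);
    $items = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($oldDir, FilesystemIterator::SKIP_DOTS),
        RecursiveIteratorIterator::SELF_FIRST // directories before their contents
    );
    foreach ($items as $item) {
        $dest = $newDir . '/' . substr($item->getPathname(), strlen($oldDir) + 1);
        if ($item->isDir()) {
            @mkdir($dest, 0755, true);
        } else {
            copy($item->getPathname(), $dest);
        }
    }
}

// Hypothetical paths; the actual folder names depend on the install:
importSdks('site/modules/.Duplicator/SDKs', 'site/modules/Duplicator/SDKs');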


@flydev I have installed 1.1.7 now.

Not sure yet if it works (set to LazyCron, every hour).

Meanwhile, I installed the Google Drive SDK via Composer. After reloading the module config page, I still can't select the GoogleDrive option. Where do I put the vendor folder from Composer? And why does https://github.com/flydev-fr/duplicator-google-drive return a 404 error?

35 minutes ago, flydev said:

like a button to import the SDKs from a previous Duplicator version.

+1. Along with a hard-to-miss notification, maybe?

32 minutes ago, dragan said:

Not sure yet if it works (set to LazyCron, every hour).

It will, after a page is visited once the hour has elapsed. (If you want to test LazyCron quickly, set it to "LogoutTrigger", log out of the admin, log back in, and then check your packages.)
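For reference, a minimal sketch of how a LazyCron hook is wired up in ProcessWire; the hook only fires during a page view once the interval has elapsed, which is why an idle site never triggers it. The backup function below is a hypothetical stand-in.

<?php
// In site/ready.php: LazyCron fires its hooks during a normal page
// view, once the chosen interval has elapsed. If nobody visits the
// site, nothing runs; hence "lazy".
wire()->addHook('LazyCron::everyHour', function ($event) {
    // runMyBackup() is a hypothetical stand-in for the job to
    // trigger, e.g. building a Duplicator package.
    runMyBackup();
});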

 

32 minutes ago, dragan said:

Where do I put the vendor folder from Composer?

How are you running Composer? On localhost and then uploading the vendor folder, or directly on the server?

The vendor folder should be alongside the "wire" and "site" folders.

[screenshot: ftp tree]
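As an aside, a minimal sketch of how the SDK classes become available once the vendor folder is in place (assuming vendor/ sits next to wire/ and site/ at the web root; Google_Client is the client class name from the Google API client of that era):

<?php
// Assuming this script lives at the web root, next to vendor/:
require_once __DIR__ . '/vendor/autoload.php';

// Composer's autoloader then resolves the SDK classes, e.g.:
$client = new Google_Client();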

 

32 minutes ago, dragan said:

And why does https://github.com/flydev-fr/duplicator-google-drive return a 404 error?

The right repo is here: https://github.com/flydev-fr/Duplicator-SDKs

 



@flydev Yes, I'm trying it out locally first. So vendor, site, and wire are all on the same level?

The local backup (now triggered twice via logout) didn't work. See the screenshot of the current module config.

[screenshot: duplicator-module-config.png]


Big thanks to @flydev, who offered to help via TeamViewer.

However, when I installed the latest version on the remote server, it didn't work. I set it to 30 seconds with LazyCron and nothing happened, even after clicking around several pages in the frontend and backend. Only by manually triggering a backup via the package manager did I get one.

And even then, the entire site was frozen: you couldn't do anything (front- or backend) while it was creating the backup. I understand such a process takes a lot of resources, but is this expected behavior?

2 hours ago, dragan said:

And even then, the entire site was frozen: you couldn't do anything (front- or backend) while it was creating the backup. I understand such a process takes a lot of resources, but is this expected behavior?

If it reassures you: what you are experiencing here is normal, and it has to do with PHP's session-locking mechanism.

To be sure, you can do a simple test:

    - open a private "incognito" browser window and navigate to your website

    - in a normal window, manually start a Duplicator job

    - while the job is running, browse your website in the incognito window

You will see that you have no issues there and navigation stays normal, because the incognito window uses a separate session.
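To illustrate the mechanism, a minimal standalone sketch (not Duplicator's code): PHP holds an exclusive lock on the session for the whole request, and session_write_close() is the standard way to release it early.

<?php
// While this script runs with the session open, PHP holds an
// exclusive lock on the session data, so every other request from
// the SAME session (same browser) blocks until the script finishes.
session_start();

// Releasing the lock early lets other requests from the same browser
// proceed while the heavy work runs. After this call, writes to
// $_SESSION are no longer persisted for this request.
session_write_close();

sleep(30); // stand-in for a long-running job such as building a backup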

 

I will test LazyCron more deeply tomorrow. Let me know if anything goes wrong with the above ;)

 


Really minor, but sometimes the order of the local vs. Google Drive packages is switched. Maybe it would be nice if they were always consistent?

[screenshot: package list]


Hello,
Trying to upload to S3, I get a "delete" message that I don't understand. Why are files being deleted?
Locally all files are safe. The S3 user is the same one I'm using with another PHP script.

2017-12-26 17:38:29:  AmazonS3: getting files.
2017-12-26 17:38:29:  - job finished in 16.970454sec
2017-12-26 17:38:29:  AmazonS3: deleted 2017-12-26_17-38-12-www.studiotm.it.package.zip
2017-12-26 17:38:29:  AmazonS3: deleting files...
2017-12-26 17:38:29:  AmazonS3: upload complete to https://s3-eu-west-1.amazonaws.com/htmlfactory-bkp/2017-12-26_17-38-12-www.studiotm.it.package.zip on bucket htmlfactory-bkp.
2017-12-26 17:38:25:  AmazonS3: uploading file /home/studiotm/public_html/site/assets/backups/2017-12-26_17-38-12-www.studiotm.it.package.zip
2017-12-26 17:38:25:  - package saved in local folder: 2017-12-26_17-38-12-www.studiotm.it.package.zip
2017-12-26 17:38:25:  - package built successfully in 12.817685sec
2017-12-26 17:38:24:  - structure zipped successfully in 11.472255sec


Thanks and...  happy holidays :)

Piero


Hi Piero,

Just to be sure (I tested it on two versions and don't get the package deleted on my end): did you change either of the following settings? They drive the retention behavior sketched after this list.

  •  Remove backup packages older than
  •  Maximum number of packages
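For illustration, a hypothetical sketch of how a "maximum number of packages" rule typically decides what to delete after an upload; this is not Duplicator's actual code, and the function name is made up.

<?php
// Hypothetical retention rule: keep only the newest $max packages.
// Assumes filenames begin with a sortable timestamp, as in
// "2017-12-26_17-38-12-www.example.com.package.zip".
function packagesToDelete(array $filenames, int $max): array
{
    rsort($filenames);                    // newest first
    return array_slice($filenames, $max); // everything past the limit
}

// With $max = 1, every package except the newest would be deleted:
print_r(packagesToDelete([
    '2017-12-25_10-00-00-www.example.com.package.zip',
    '2017-12-26_17-38-12-www.example.com.package.zip',
], 1));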

 


 

I'm trying to use the FTP backup function to a server and repeatedly get an error in the logs.

2017-12-27 09:30:11:  [FTP] error: cannot connect to <sftp://domain.ddns.me> on port <22>

I've checked and double-checked my credentials, and I have four separate FTP applications open here, all connecting with the same details.

I've also tried and retried combinations of regular FTP mode, port 21 vs. 22, passive mode, etc.

Are there any known FTP issues reported?

 

 


Which operating system and which PHP version are you using?


@jeve please change the "Remove backup packages older than" setting to "Never" and try again, then let me know :)

 

@Peter Knight are you sure you are not mixing up SFTP and FTPS? SFTP is based on the SSH protocol, which is not supported.
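For background, a minimal PHP sketch of why the distinction matters (host and credentials are placeholders): FTPS is TLS-wrapped FTP and is served by PHP's standard ftp extension, while SFTP runs over SSH and would need the separate ssh2 extension.

<?php
// FTPS (FTP over explicit TLS): handled by PHP's ftp extension.
$conn = ftp_ssl_connect('hostname.example', 21);
ftp_login($conn, 'user', 'pass');

// SFTP (file transfer over SSH) is a different protocol entirely and
// would require the ssh2 PECL extension instead:
// $session = ssh2_connect('hostname.example', 22);
// ssh2_auth_password($session, 'user', 'pass');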

20 minutes ago, flydev said:

@jeve please change the "Remove backup packages older than" setting to "Never" and try again, then let me know :)

@Peter Knight are you sure you are not mixing up SFTP and FTPS? SFTP is based on the SSH protocol, which is not supported.

@flydev

Yes, most likely. Thanks for spotting that.

I only have the following protocols enabled on my destination, so if it can connect over FTPS, then it could be a port number issue?

  • FTPS (port 21)
  • SFTP (port 22)

I turned on normal FTP mode and had more success, but not 100%:

2017-12-27 10:07:11:  - job finished in 34.438614sec
2017-12-27 10:07:11:  [FTP] error: cannot upload file <2017-12-27_10-06-37-www.mysite.com.package.zip> on server.
2017-12-27 10:07:11:  FTP: starting upload of /var/www/vhosts/mysite.com/httpdocs/site/assets/backups/2017-12-27_10-06-37-www.mysite.com.package.zip
2017-12-27 10:07:11:  FTP: directory listing of BackupFolder successfull.
2017-12-27 10:07:11:  FTP: retrieving directory listing of BackupFolder...
2017-12-27 10:07:11:  FTP: directory listing of ShareFolder successfull.
2017-12-27 10:07:11:  FTP: retrieving directory listing of Sharey01...
2017-12-27 10:07:11:  FTP: logged in.
2017-12-27 10:07:11:  FTP: connection established.
2017-12-27 10:07:11:  FTP: connecting to domain.ddns.me:21...
2017-12-27 10:07:11:  - package saved in local folder: 2017-12-27_10-06-37-www.mysite.com.package.zip

 

 


Good. And did you test uploading a file with a regular FTP client? It looks like a permission issue. Check your PM.

Edit: @Peter Knight, please enable the "Passive Mode" setting in Duplicator > FTP settings.

[screenshot: FTP settings]
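For background, a minimal sketch of what the setting does at the PHP level (placeholder host and credentials; not the module's actual code):

<?php
// In passive mode the client opens the data connection itself, which
// usually works better through NAT and firewalls than active mode,
// where the server connects back to the client.
$conn = ftp_connect('hostname.example', 21);
ftp_login($conn, 'user', 'pass');
ftp_pasv($conn, true); // enable passive mode before any transfer

ftp_put($conn, 'backup.zip', '/local/path/backup.zip', FTP_BINARY);
ftp_close($conn);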

18 minutes ago, flydev said:

Good. And did you test uploading a file with a regular FTP client? It looks like a permission issue. Check your PM.

Edit: @Peter Knight, please enable the "Passive Mode" setting in Duplicator > FTP settings.

Finally working, thanks for the help.

Steps to resolve:

1. Use normal FTP mode instead of FTPS or SFTP.

2. In the host field, use just hostname.ddns.me instead of ftp://hostname.ddns.me.

3. Use passive mode, as you suggested.

Thanks again. Really nice module you have there. I normally do backups to a VPS because it has built-in Dropbox backups, but that is getting to be an expensive enterprise, so storing them on my own NAS like this will really help. Looking forward to Dropbox integration too; I notice it's temporarily parked.

8 minutes ago, jeve said:

I made the change but S3 is still deleting files.

I will look at it tonight; it's strange, as I can't reproduce it. I set the same package name and the same settings, and the package gets uploaded but nothing is deleted 🙈

Could you tell me if the bucket already contains other files?

Edit: my logs:


2017-12-27 12:05:25:  AmazonS3: getting files.
2017-12-27 12:05:25:  - job finished in 8.614877sec
2017-12-27 12:05:24:  AmazonS3: upload complete to https://duplicator-flydev-tokyo.s3.***.amazonaws.com/2017-12-27_12-05-17-www.studiotm.it.package.zip on bucket duplicator-flydev-tokyo.
2017-12-27 12:05:19:  AmazonS3: uploading file /www/sites/blog/wwwroot/web_backups/2017-12-27_12-05-17-www.studiotm.it.package.zip
2017-12-27 12:05:17:  - package saved in local folder: 2017-12-27_12-05-17-www.studiotm.it.package.zip
2017-12-27 12:05:17:  - package built successfully in 0.488209sec
2017-12-27 12:05:17:  - structure zipped successfully in 0.022769sec
2017-12-27 12:05:17:  - advanced option activated: ArchiveFlush
2017-12-27 12:05:07:  AmazonS3: getting files.

 


If I want this backup task to run every day at a set time, am I correct in thinking that I require the PWCron module, with the backup set in its "Modules to run" field, and that I need to paste the appropriate command into the crontab?

4 0 * * * php /var/www/vhosts/mydomain.com/site/modules/PwCron/cron.php >/dev/null 2>&1
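(For reference, the five leading crontab fields are minute, hour, day of month, month, and day of week, so 4 0 * * * runs at 00:04 every day; adjust the first two fields for the desired time.)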


