
☁️ Duplicator: Backup and move sites


flydev

Recommended Posts

@flydev the clear log button will be helpful, thanks. I have one request: when upgrading to the latest version, I have to manually move the SDK files from the old folder (starting with a dot: ".Duplicator") to the new, updated one. Is there a way to automate this task? It's easy to do manually but I have to remember to do it each time I upgrade.


I see, it's not possible with ProcessWireUpgrade, as the SDKs are not bundled with the module. I will try to find a workaround, like a button to import the SDKs from a previous Duplicator version. If you have a better idea, I am open :)
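
In the meantime, a one-off script along these lines could automate the copy. This is only a rough PHP sketch: the old dot-prefixed module folder and the SDKs subfolder are assumptions based on the description above, so adjust the paths to your install.

<?php
// Hypothetical one-off script: copy SDK files from the old, dot-prefixed
// module folder (left behind by an upgrade) into the new module folder.
// Folder names are assumptions; adjust to your installation.
$old = __DIR__ . '/site/modules/.Duplicator/SDKs';
$new = __DIR__ . '/site/modules/Duplicator/SDKs';

if (is_dir($old)) {
    if (!is_dir($new)) mkdir($new, 0755, true);
    $items = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($old, FilesystemIterator::SKIP_DOTS),
        RecursiveIteratorIterator::SELF_FIRST
    );
    foreach ($items as $item) {
        $target = $new . '/' . substr($item->getPathname(), strlen($old) + 1);
        if ($item->isDir()) {
            if (!is_dir($target)) mkdir($target, 0755, true);
        } else {
            copy($item->getPathname(), $target); // copy each SDK file across
        }
    }
}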


@flydev I have installed 1.1.7 now.

Not sure yet if it works (set to lazyCron, every hour).

Meanwhile, I installed the GoogleDrive SDK via Composer. After reloading the module config page, I still can't select the GoogleDrive option. Where do I put the vendor folder from Composer? And why does https://github.com/flydev-fr/duplicator-google-drive return a 404 error page?


32 minutes ago, dragan said:

Not sure yet if it works (set to lazyCron, every hour).

It will, an hour after a page gets visited. (If you want to try LazyCron right away, set it to "LogoutTrigger", log out of the admin, then log back in and check your packages.)
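
For context, LazyCron only runs as a side effect of a page view, so nothing fires until the site receives a request after the chosen interval has passed. A minimal sketch of how a LazyCron interval is hooked in ProcessWire, e.g. from site/ready.php (the log call is only for illustration):

<?php
// site/ready.php - LazyCron hooks piggyback on page views: the callback runs
// at the end of the first request that occurs after the interval has elapsed,
// never on a fixed clock.
$wire->addHook('LazyCron::everyHour', function(HookEvent $event) {
    // The hook receives the number of seconds since it last ran.
    $seconds = $event->arguments(0);
    wire('log')->save('lazycron-demo', "Ran again after $seconds seconds");
});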

 

32 minutes ago, dragan said:

Where do I put the vendor folder from Composer?

How are you running Composer? On localhost and uploading the vendor folder, or directly on the server?

The vendor folder should sit alongside the "wire" and "site" folders.

[Screenshot: FTP directory tree showing the vendor folder next to the wire and site folders]
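
In other words, the Composer vendor folder (and its autoload.php) lives in the site root, next to index.php, wire/ and site/. A minimal sketch of how an SDK installed that way becomes available to PHP code; the Google client class is just an example from the google/apiclient package, not necessarily what Duplicator uses internally:

<?php
// Root-level Composer autoloader, i.e. /vendor/autoload.php next to wire/ and site/.
// Path shown relative to a file in the site root; adjust if including from elsewhere.
require_once __DIR__ . '/vendor/autoload.php';

// Once the autoloader is loaded, classes installed via Composer resolve normally,
// e.g. the Google API client from "composer require google/apiclient":
$client = new Google_Client();
$client->setApplicationName('Duplicator backup');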

 

32 minutes ago, dragan said:

And why is https://github.com/flydev-fr/duplicator-google-drive an error 404 page?

The right repo is here: https://github.com/flydev-fr/Duplicator-SDKs

 


Big thanks to @flydev - he offered to help via TeamViewer.

However, when I installed the latest version on the remote server, it didn't work. I set it to 30 seconds with LazyCron and nothing happened. I clicked around several pages in the frontend and backend... Only by manually triggering a backup via the package manager did I get one.

And even then, the entire site was frozen. You couldn't do anything (front- or backend) while it was creating the backup. I understand such a process takes a lot of resources, but is this expected behavior?


2 hours ago, dragan said:

And even then, the entire site was frozen. You couldn't do anything (front- or backend) while it was creating the backup. I understand such a process takes a lot of resources, but is this expected behavior?

If it reassures you, what you are experiencing here is normal; it comes down to PHP's session locking mechanism.

To be sure, you can do a simple test:

    - open the browser with a private "incognito" window and navigate to your website

    - open a normal tab and manually start a Duplicator job

    - then, while the job is running, navigate your website in the incognito window

You will see that you have no issues and that navigation on the website stays normal.
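
For background: PHP locks the session file for the whole request, so a long-running backup in one tab blocks every other request that carries the same session cookie; the incognito window works because it uses a different session. A generic sketch of the mechanism (plain PHP, not Duplicator's actual code):

<?php
// Generic illustration of PHP session locking; not Duplicator's actual code.
session_start();            // acquires an exclusive lock on the session file

// Every other request carrying the same session cookie now waits inside its
// own session_start() until this request releases the lock.

// If the session is no longer needed, releasing it early keeps the rest of
// the site responsive for this browser while the heavy work runs:
session_write_close();      // write the session data and release the lock

sleep(30);                  // stand-in for the long-running backup job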

 

I will test LazyCron more deeply tomorrow. Let me know if anything goes wrong with the above ;)

 


  • 2 weeks later...

Hello
Trying to upload to S3, I get a "deleted" message that I don't understand. Why are files being deleted?
Locally all the files are safe. The S3 user is the same one I use with another PHP script.

2017-12-26 17:38:29:  AmazonS3: getting files.
2017-12-26 17:38:29:  - job finished in 16.970454sec
2017-12-26 17:38:29:  AmazonS3: deleted 2017-12-26_17-38-12-www.studiotm.it.package.zip
2017-12-26 17:38:29:  AmazonS3: deleting files...
2017-12-26 17:38:29:  AmazonS3: upload complete to https://s3-eu-west-1.amazonaws.com/htmlfactory-bkp/2017-12-26_17-38-12-www.studiotm.it.package.zip on bucket htmlfactory-bkp.
2017-12-26 17:38:25:  AmazonS3: uploading file /home/studiotm/public_html/site/assets/backups/2017-12-26_17-38-12-www.studiotm.it.package.zip
2017-12-26 17:38:25:  - package saved in local folder: 2017-12-26_17-38-12-www.studiotm.it.package.zip
2017-12-26 17:38:25:  - package built successfully in 12.817685sec
2017-12-26 17:38:24:  - structure zipped successfully in 11.472255sec


Thanks and...  happy holidays :)

Piero


 

I'm trying to use the FTP backup function to a server and repeatedly get an error in the logs.

2017-12-27 09:30:11:  [FTP] error: cannot connect to <sftp://domain.ddns.me> on port <22>

I've checked and double-checked my credentials, and I have 4 separate FTP applications open here connecting with the same details.

I have also tried and re-tried a combination of regular FTP mode, port 21 vs. 22, passive mode, etc.

Are there any known FTP issues reported?

 

 


20 minutes ago, flydev said:

@jeve  please change the "Remove backup packages older than" setting to "Never" and try again then let me know :)

 

@Peter Knight are you sure you are not mixing up SFTP and FTPS? SFTP is based on the SSH protocol, which is not supported.
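
The distinction matters because PHP's built-in FTP functions handle plain FTP and FTPS (FTP over TLS), whereas SFTP is a different protocol running over SSH and needs a separate extension or library. A minimal sketch with PHP's standard FTP functions (host and credentials are placeholders):

<?php
// FTPS = FTP over TLS, handled by PHP's ftp extension:
$conn = ftp_ssl_connect('hostname.ddns.me', 21);   // explicit FTPS, still port 21

// Plain FTP uses the same extension:
// $conn = ftp_connect('hostname.ddns.me', 21);

// SFTP is not FTP at all: it runs over SSH (port 22) and would require
// something like the ssh2 extension or the phpseclib library instead.
ftp_login($conn, 'user', 'password');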

@fly

Yes - most likely. Thanks for spotting that.

I only have the following protocols enabled on my destination, so if it can connect over FTPS then it could be a port number issue?

  • FTPS (port 21)
  • SFTP (port 22)

I turned on normal FTP mode and had more success, but not 100%:

2017-12-27 10:07:11:  - job finished in 34.438614sec
2017-12-27 10:07:11:  [FTP] error: cannot upload file <2017-12-27_10-06-37-www.mysite.com.package.zip> on server.
2017-12-27 10:07:11:  FTP: starting upload of /var/www/vhosts/mysite.com/httpdocs/site/assets/backups/2017-12-27_10-06-37-www.mysite.com.package.zip
2017-12-27 10:07:11:  FTP: directory listing of BackupFolder successfull.
2017-12-27 10:07:11:  FTP: retrieving directory listing of BackupFolder...
2017-12-27 10:07:11:  FTP: directory listing of ShareFolder successfull.
2017-12-27 10:07:11:  FTP: retrieving directory listing of Sharey01...
2017-12-27 10:07:11:  FTP: logged in.
2017-12-27 10:07:11:  FTP: connection established.
2017-12-27 10:07:11:  FTP: connecting to domain.ddns.me:21...
2017-12-27 10:07:11:  - package saved in local folder: 2017-12-27_10-06-37-www.mysite.com.package.zip

 

 


18 minutes ago, flydev said:

Good. And did you test uploading a file with a regular FTP client? It looks like a permission issue. Check your PM.

 

Edit: @Peter Knight please enable the "Passive Mode" setting in Duplicator > FTP settings

[Screenshot: Duplicator FTP settings showing the Passive Mode option]

Finally working - thanks for the help.

Steps to resolve (a rough PHP sketch of the equivalent calls follows the list):

1. Use normal FTP mode instead of FTPS or SFTP

2. In the host field, I was using ftp://hostname.ddns.me instead of just hostname.ddns.me

3. Use passive mode, as you suggest
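
For anyone hitting the same wall, here is roughly how those three fixes map onto plain PHP FTP calls (host, credentials and file names are placeholders, not the module's actual code):

<?php
// 1. Plain FTP rather than FTPS/SFTP, and
// 2. a bare host name - no "ftp://" scheme in front of it:
$conn = ftp_connect('hostname.ddns.me', 21);
ftp_login($conn, 'user', 'password');

// 3. Passive mode: the server opens the data connection, which usually gets
// through NAT/firewalls on the destination side (e.g. a NAS behind a router):
ftp_pasv($conn, true);

ftp_put($conn, 'BackupFolder/package.zip', '/local/path/package.zip', FTP_BINARY);
ftp_close($conn);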

Thanks again. Really nice module you have there. I normally do backups to a VPS because it has built-in Dropbox backups. However, it's getting to be an expensive enterprise, so storing them on my own NAS like this will really help. Looking forward to the Dropbox integration too; I notice it's temporarily parked.


8 minutes ago, jeve said:

I made the change but S3 is still deleting files.

I will look at it tonight; it's strange, as I can't reproduce it. I set the same package name and the same settings, and the package is uploaded but nothing gets deleted.

Could you tell me whether the bucket already contains other files?

Edit: my logs:


2017-12-27 12:05:25:  AmazonS3: getting files.
2017-12-27 12:05:25:  - job finished in 8.614877sec
2017-12-27 12:05:24:  AmazonS3: upload complete to https://duplicator-flydev-tokyo.s3.***.amazonaws.com/2017-12-27_12-05-17-www.studiotm.it.package.zip on bucket duplicator-flydev-tokyo.
2017-12-27 12:05:19:  AmazonS3: uploading file /www/sites/blog/wwwroot/web_backups/2017-12-27_12-05-17-www.studiotm.it.package.zip
2017-12-27 12:05:17:  - package saved in local folder: 2017-12-27_12-05-17-www.studiotm.it.package.zip
2017-12-27 12:05:17:  - package built successfully in 0.488209sec
2017-12-27 12:05:17:  - structure zipped successfully in 0.022769sec
2017-12-27 12:05:17:  - advanced option activated: ArchiveFlush
2017-12-27 12:05:07:  AmazonS3: getting files.
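
For context, the "Remove backup packages older than" setting is a retention rule: when it is set to anything other than "Never", older packages on the remote storage are cleaned up after each upload, which is the "deleting files..." step in the logs above. A minimal sketch of that kind of cleanup with the AWS SDK for PHP; this only illustrates the idea and is not Duplicator's actual code, and the bucket name and cut-off are placeholders:

<?php
require_once __DIR__ . '/vendor/autoload.php';

use Aws\S3\S3Client;

// Illustration only: delete backup packages older than a cut-off.
// Credentials come from the environment / default provider chain.
$s3 = new S3Client(['region' => 'eu-west-1', 'version' => 'latest']);
$cutoff = strtotime('-7 days');   // placeholder retention period

$objects = $s3->listObjectsV2(['Bucket' => 'htmlfactory-bkp']);
foreach ($objects['Contents'] ?? [] as $object) {
    // LastModified is a DateTime-like object in the SDK response.
    if ($object['LastModified']->getTimestamp() < $cutoff) {
        $s3->deleteObject([
            'Bucket' => 'htmlfactory-bkp',
            'Key'    => $object['Key'],
        ]);
    }
}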

 


If I want this backup task to run every day at a specific time, am I correct in thinking I require the PWCron module and need to paste the appropriate command into PWCron's "Modules to run" field?

4 0 * * * php /var/www/vhosts/mydomain.com/site/modules/PwCron/cron.php >/dev/null 2>&1
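
For reference, the general pattern behind such a cron entry point is to bootstrap ProcessWire from the command line and then call into the module that should run. A rough sketch only: the path is a placeholder, and the exact Duplicator call is left out because its API isn't documented here.

<?php
// Hypothetical CLI entry point, run from crontab like the line above.
// Including ProcessWire's index.php from a script bootstraps the API
// without a page view, so no LazyCron trigger or visitor is needed.
include '/var/www/vhosts/mydomain.com/httpdocs/index.php'; // path is a placeholder

// The ProcessWire API is now available through wire():
$modules = wire('modules');
$duplicator = $modules->get('Duplicator'); // module name assumed
// How the backup is actually triggered from here depends on the module's
// own methods (or on PWCron's conventions) and is intentionally left out.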

