
Backup sites and databases


Peter Knight

Had a discussion with a developer friend this morning about preventing hacking and backing up sites and databases.

Here are some of the methods we discussed:

  • PHP scripts hooking into DropBox/Amazon/Google Drive
  • Manually backing up and saving to NAS. 
  • Manually backing up, syncing to NAS and then syncing to online service
  • Writing cron jobs that scan the root for file and folder changes, email you alerts, and back up at regular intervals (see the sketch after this list).
  • Online services such as CodeGuard
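
For the cron option above, a minimal sketch of what such a job could look like; the paths, database credentials and alert address are assumptions, so adjust everything for your own server:

    #!/bin/bash
    # Hypothetical change-scan and backup script, run daily from cron.
    SITE_ROOT=/var/www/example.com
    BACKUP_DIR=/var/backups/example.com
    ALERT_MAIL=you@example.com

    mkdir -p "$BACKUP_DIR"

    # Alert on files changed in the last 24 hours.
    CHANGED=$(find "$SITE_ROOT" -type f -mtime -1 2>/dev/null)
    if [ -n "$CHANGED" ]; then
        echo "$CHANGED" | mail -s "Files changed on example.com" "$ALERT_MAIL"
    fi

    # Regular backup: dump the database and archive the files.
    STAMP=$(date +%Y%m%d)
    mysqldump -u dbuser -p'dbpass' example_db | gzip > "$BACKUP_DIR/db-$STAMP.sql.gz"
    tar -czf "$BACKUP_DIR/files-$STAMP.tar.gz" -C "$SITE_ROOT" .

Scheduled with a crontab line along the lines of: 0 3 * * * /usr/local/bin/site-backup.sh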

What method(s) do you use to back up your sites? I'm talking about any type of site on any CMS.


I have been using these two for a long time. There might be better options out there, as both are quite old and haven't been updated in a while, but they work great and I'm not sure they need updating :)

http://sourceforge.net/projects/automysqlbackup/ 

http://www.rsnapshot.org/ (incremental backups of the sql backup files and all other site files to a remote server)
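
In case it helps anyone wiring up the same combination, here is roughly how the two fit together via cron; the paths and retain-level names are assumptions, and both tools are driven by their own config files:

    # Hypothetical crontab entries -- adjust paths to your install.
    # 1. automysqlbackup writes rotating SQL dumps locally, per its own config.
    0 1 * * * /usr/sbin/automysqlbackup

    # 2. rsnapshot then takes incremental snapshots of those dumps plus the site
    #    files to the destination defined in /etc/rsnapshot.conf.
    30 1 * * * /usr/bin/rsnapshot daily
    0  2 * * 7 /usr/bin/rsnapshot weekly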


I've been using this method Ryan posted. As Ryan mentioned at the end of the post, rsync copies them to my local HD, which is then again backed up via Time Machine. The shell script for backing up the file system is a little more complicated. My version is essentially a tweak of this script on github.

The major drawback is that the backups eat into your server space. I've been meaning to explore the option of moving them to Dropbox/Amazon.

So far I have plenty of space, so I'm just letting it ride.
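
For anyone reading along, the general shape of that setup is a database dump written by cron on the server and then pulled down with rsync; the paths, host and credentials below are assumptions, and the real scripts are in the links above:

    # On the server (crontab): nightly gzipped database dump.
    # 0 2 * * * mysqldump -u dbuser -p'dbpass' site_db | gzip > /home/user/backups/site_db-$(date +\%F).sql.gz

    # On the local machine: pull the dumps down, where Time Machine then backs
    # them up again. No --delete, so older local copies are kept.
    rsync -avz user@example.com:/home/user/backups/ ~/Backups/example.com/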


I'm on a German shared host run by some Linux devs (no marketing/business people). They describe their backup strategy here, but it's written in German. As a user I found it super easy to go through backups and restore stuff, which is why I like it, and as a hosting provider this has to be bulletproof.

https://wiki.uberspace.de/system:backup

If there's some interest I could maybe translate it.


Quick followup in case anyone else comes to this thread.

I had a look at all the links and eventually settled on Akeeba Solo (thanks @cstevensjr) as it was the most compatible with my server.

The website is a little unclear, but you install it on your server, enter your database credentials, and from that point on you have quite an array of backup and config options.

Setting up my backups (databases and files) with Dropbox was really simple. There are also three different options for scheduled backups via cron.
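
For anyone curious, one common pattern among those cron options is simply a crontab entry that triggers a backup run on a schedule; the URL below is a purely hypothetical placeholder, as the exact command for your install comes from Solo itself:

    # Hypothetical scheduled trigger -- substitute the trigger URL/command shown
    # by your own Akeeba Solo installation.
    0 4 * * * wget -q -O /dev/null "https://example.com/your-solo-backup-trigger-url"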


I'm also using this German hoster (the one linked above) for some sites. In a nutshell, they use rsnapshot for their backups: their backup server pulls data from the live server, and the backups are mounted on the live server via NFS.

I came to PW from the Joomla universe, so I'm familiar with Akeeba Backup, which is a great product. Their free and well-maintained standalone version makes backing up PW and other PHP applications a breeze.


And so this one doesn't get overlooked...

https://github.com/ryancramerdesign/ProcessDatabaseBackups

(should work great for normal sites)

Since it works from the API, you could set up a cron or LazyCron job to automate it (maybe send the backup file via mail to get the DB to a second place, away from the webspace?).

regards mr-fan


I saw this topic yesterday but pwired was quicker :).

I often use XCloner Standalone, as Akeeba Solo didn't exist yet at the moment I had to choose a standalone version, between Akeeba Backup standalone and XCloner standalone.

(I usually have to adjust some "paths" and after that it works well.)

You can exclude chosen directories and files from the backup, and also "store your backups on the Amazon S3 cloud", for example.

I don't know if Akeeba Solo (which seems to be the new name for the standalone version) can exclude directories and files...


I was about to try XCloner, as I know it from my Joomla days, but I found this vulnerability report, which doesn't seem to have been answered in any way by the XCloner team. I searched exploit-db.com for Akeeba and found just one entry, which doesn't seem very important. I'd actually like to use a free script, but that's the way it is.

By the way, there are no entries for "processwire" there.


  • 1 month later...

I use a variation on some mysqldump and zip commands via cron to back up daily to a folder on each site's account, then have a scheduled script on my NAS download the backups each night.
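
If anyone wants a starting point, here is a rough sketch of that kind of daily job; the database name, paths and retention period are assumptions:

    #!/bin/bash
    # Daily backup sketch: dump the database, zip it with the site files, and
    # leave the archive in a folder for the NAS to collect overnight.
    STAMP=$(date +%Y-%m-%d)
    BACKUP_DIR=$HOME/backups
    mkdir -p "$BACKUP_DIR"

    mysqldump -u dbuser -p'dbpass' site_db > "$BACKUP_DIR/site_db-$STAMP.sql"
    zip -qr "$BACKUP_DIR/site-$STAMP.zip" "$HOME/public_html" "$BACKUP_DIR/site_db-$STAMP.sql"
    rm "$BACKUP_DIR/site_db-$STAMP.sql"

    # Prune archives older than 14 days so the account doesn't fill up.
    find "$BACKUP_DIR" -name '*.zip' -mtime +14 -delete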

I could do with improving it so it sends a report if it can't find a backup file, but that's an easy enough thing to check.

I'm sure I could use the programs on my Synology to then backup to the cloud if I wanted to.

Of course, if I had hosting at two different web hosts it would be quicker to transfer backups between them, but there's something reassuring about having a backup on a local device :)


  • 2 weeks later...

If you want an alternative solution for backing up sites and databases, try CloudBacko Pro. It works for Windows, Mac, Linux etc. and provides some pretty useful features. I am leaving a link with this comment; go for it, feel free to use it.

Edited by kongondo
@Skaggs4: your post looks suspiciously like spam/unauthorised advertisement to me. I have removed the link to the outside resource

  • 8 months later...

Revisiting an old thread:

On my Debian box, I have a dump.sh script that I run manually. I also have a git pre-commit hook that runs the same script. Then I git commit to my GitLab account and rsync my site to the live site.

I want to automate as much as possible and learn bash as I go.

My next script (once I figure out how) will rsync the latest dump.sql to the live site and import it.
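
For anyone wiring up something similar, a minimal version of that arrangement; the database name, dump location and remote path are assumptions:

    # --- dump.sh (in the repository root) ---
    #!/bin/bash
    set -e
    # Dump the site database into the repo so the SQL file gets committed.
    mysqldump -u dbuser -p'dbpass' site_db > db/dump.sql

    # --- .git/hooks/pre-commit ---
    #!/bin/bash
    # Git runs hooks from the repo root, so this refreshes and stages the dump
    # before every commit.
    ./dump.sh
    git add db/dump.sql

    # --- deploy ---
    # Push to GitLab, then rsync the working copy (minus .git) to the live site.
    git push origin master
    rsync -avz --exclude '.git' ./ user@liveserver:/var/www/site/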

---

EDIT, adding below:

On second thought: since I am using git, is there a way to just dump the latest changes in the database and have a teeny tiny SQL file, instead of dumping the whole thing?

Mine is still relatively small, but over time, it will grow.

