Peter Knight Posted November 26, 2014

Had a discussion with a developer friend this morning about hacking (preventing it) and backing up sites and databases. Here are some of the methods discussed:

- PHP scripts hooking into Dropbox/Amazon/Google Drive
- Manually backing up and saving to a NAS
- Manually backing up, syncing to a NAS and then syncing to an online service
- Writing cron jobs which scan the root for file and folder changes, email you alerts and back up at regular intervals (a sketch of this follows below)
- Online services such as CodeGuard

What method(s) do you use to back up your sites? I'm talking about any type of site on any CMS.
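For the cron-based idea above, a minimal sketch of what such a change-detection script might look like; the paths, email address and schedule are assumptions for illustration, not part of the original discussion:

```
#!/bin/bash
# Sketch: fingerprint every file under the web root and mail an alert
# when anything changed since the last run. Schedule from cron, e.g.:
#   0 * * * * /home/me/bin/watch-root.sh
# ROOT, STATE and ALERT below are placeholder values.

ROOT="/var/www/mysite"
STATE="/home/me/.watch-root.state"
ALERT="admin@example.com"

# Record path, size and mtime for every file (GNU find).
find "$ROOT" -type f -printf '%p %s %T@\n' | sort > "$STATE.new"

# Compare against the previous run and mail the difference.
if [ -f "$STATE" ] && ! diff -q "$STATE" "$STATE.new" >/dev/null; then
  diff "$STATE" "$STATE.new" | mail -s "Files changed on mysite" "$ALERT"
fi

mv "$STATE.new" "$STATE"
```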
adrian Posted November 26, 2014

I have been using these two for a long time. There might be better options out there, as both of these are quite old and haven't been updated in a long time, but they work great and I'm not sure they need updating:

http://sourceforge.net/projects/automysqlbackup/
http://www.rsnapshot.org/ (incremental backups of the SQL backup files and all other site files to a remote server)
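To make the pairing concrete, here is a hypothetical crontab wiring the two tools together; the paths and times are assumptions, and each tool reads its real settings from its own config file:

```
# Dump all MySQL databases nightly (automysqlbackup reads its
# configuration, e.g. /etc/automysqlbackup/automysqlbackup.conf).
0 1 * * * /usr/sbin/automysqlbackup

# An hour later, take incremental snapshots of the site files and
# the SQL dumps. rsnapshot hard-links unchanged files, so each
# snapshot only costs the space of what actually changed.
0 2 * * * /usr/bin/rsnapshot daily
0 3 * * 1 /usr/bin/rsnapshot weekly
```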
Peter Knight Posted November 26, 2014

Interesting links.
horst Posted November 26, 2014

This one is rock solid; it has been in development since 2003: http://www.mysqldumper.net/features/
renobird Posted November 26, 2014

I've been using this method Ryan posted. As Ryan mentioned at the end of the post, rsync copies the backups to my local HD, which is then backed up again via Time Machine. The shell script for backing up the file system is a little more complicated; my version is essentially a tweak of this script on GitHub. The major drawback is that the backups eat into your server space. I've been meaning to explore the option of moving them to Dropbox/Amazon, but so far I have plenty of space, so I'm just letting it ride.
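A minimal sketch in the spirit of that approach, assuming a nightly mysqldump with simple rotation; the database name, paths and 30-day retention are placeholders, not Ryan's exact script:

```
#!/bin/bash
# Nightly database dump with rotation, run from cron on the server.

DB="mysite"                          # placeholder database name
BACKUP_DIR="/home/mysite/backups"    # placeholder path
STAMP=$(date +%Y%m%d)

mkdir -p "$BACKUP_DIR"

# Dump and compress; credentials come from ~/.my.cnf so the
# password never appears in the crontab or the process list.
mysqldump "$DB" | gzip > "$BACKUP_DIR/$DB-$STAMP.sql.gz"

# Keep 30 days of dumps on the server -- the part that
# "eats into your server space".
find "$BACKUP_DIR" -name "$DB-*.sql.gz" -mtime +30 -delete
```

On the local machine, something like `rsync -az user@server:/home/mysite/backups/ ~/site-backups/mysite/` in a cron job pulls the dumps down, where Time Machine picks them up.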
cstevensjr Posted November 26, 2014

I'm a former Joomla adherent. I have used paid Akeeba products for years and am now trying their stand-alone PHP backup solution (Akeeba Solo). I've been on vacation for over a month and therefore have not invested much time in putting this product through its paces.
LostKobrakai Posted November 26, 2014

I'm on German shared hosting provided by some Linux devs (no marketing/business people). They describe their backup strategy here, but it's written in German. As a user I found it super easy to browse the backups and restore things, which is why I like it, and as a hosting provider this has to be bulletproof: https://wiki.uberspace.de/system:backup

If there's interest, I could maybe translate it.
Marty Walker Posted November 26, 2014

All of my sites are on a cPanel VPS, so I use the backup-to-Amazon-S3 feature. From there they get shuffled off to Glacier. Previously I used http://www.cpanelbackupscript.com/
Peter Knight Posted December 3, 2014

Quick follow-up in case anyone else comes to this thread. I had a look at all the links and eventually settled on Akeeba Solo (thanks @cstevensjr) as it was the most compatible with my server. The website is a little unclear, but you install it on your server, enter your database credentials, and from that point on you have quite an array of backup and config options. Setting up my backups (databases and files) with Dropbox was really simple. There are also three different options for scheduled backups via cron.
pwired Posted December 3, 2014

My search for this ended with XCloner (standalone version!): http://www.xcloner.com/ This stuff is for real.
gebeer Posted December 4, 2014

(Replying to LostKobrakai's post about the German host above.) I'm also using this German hoster for some sites. In a nutshell, they use rsnapshot for their backups: their backup server pulls data from the live server, and the backups are mounted on the live server via NFS.

I came to PW from the Joomla universe, so I'm familiar with Akeeba Backup, which is a great product. Using their free and well-maintained standalone version makes backing up PW and other PHP applications a breeze.
mr-fan Posted December 4, 2014

And so this one doesn't get overlooked: https://github.com/ryancramerdesign/ProcessDatabaseBackups (should work great for normal sites). Since it works from the API, you could set up a cron or LazyCron job to automate it (and maybe send the backup file via mail to keep a copy of the DB somewhere other than the webspace?).

Regards, mr-fan
Christophe Posted December 4, 2014

I saw this topic yesterday, but pwired was quicker. I often use XCloner Standalone, as Akeeba Solo didn't yet exist at the time I had to choose between Akeeba Backup standalone and XCloner standalone. (I usually have to adjust some "paths", and after that it works well.) You can exclude chosen directories and files from the backup, and also "store your backups on the Amazon S3 cloud", for example. I don't know if Akeeba Solo (which seems to be the new name for the standalone version) can exclude directories and files...
Peter Knight Posted December 4, 2014

@Christophe, you can exclude directories and files.
LostKobrakai Posted December 5, 2014

BitTorrent Sync could be a rather simple solution for distributing the filesystem to different machines.
pwired Posted December 5, 2014

Considering the title of this topic: how would you use BitTorrent Sync to back up a database?
renobird Posted December 5, 2014

@pwired, I think he was referring to backing up the filesystem by copying it to a remote location. Still on topic.
Ivan Gretsky Posted December 12, 2014

I was about to try XCloner, as I know it from my Joomla days, but then found this vulnerability report, which doesn't seem to have been answered in any way by the XCloner team. I searched exploit-db.com for Akeeba and found just one entry, which seems to be a minor one. I would actually prefer a free script, but that's the way it is. By the way, there are no entries about "processwire" there.
Sheri82 Posted January 17, 2015

Backup through hard disk.
Pete Posted January 17, 2015

I use a variation on some mysqldump and zip commands via cron to back up daily to a folder on each site's account, then have a scheduled script on my NAS download the backups each night. I could improve it so it sends a report if it couldn't find a backup file, but that's an easy enough thing to check. I'm sure I could use the programs on my Synology to then back up to the cloud if I wanted to. Of course, if I had hosting at two different web hosts it would all be quicker to transfer backups between them, but there's something reassuring about having a backup on a local device.
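The "report if a backup is missing" check Pete mentions could look something like this sketch; the NAS path, site names and address are placeholder assumptions:

```
#!/bin/bash
# Run on the NAS after the nightly download; mails an alert for any
# site that has no dump newer than 24 hours.

BACKUP_ROOT="/volume1/backups"    # hypothetical Synology path
SITES="site1 site2 site3"         # placeholder site list
ALERT="admin@example.com"

for site in $SITES; do
  recent=$(find "$BACKUP_ROOT/$site" -name '*.sql.gz' -mmin -1440 | head -n 1)
  if [ -z "$recent" ]; then
    echo "No backup newer than 24h for $site" \
      | mail -s "Backup missing: $site" "$ALERT"
  fi
done
```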
Peter Knight Posted January 19, 2015

(Quoting Pete: "I'm sure I could use the programs on my Synology to then back up to the cloud if I wanted to.")

Are you using DSM 5.1? Synology has a few options for you:

1. Backup to Amazon S3
2. Cloud Sync to Google Drive, Dropbox, Box, etc.
Skaggs4 Posted January 29, 2015 (edited)

If you want an alternative solution for backing up sites and databases, try CloudBacko Pro. It works on Windows, Mac, Linux, etc. and provides some pretty useful features. I am leaving a link with this comment; feel free to use it.

Edited January 30, 2015 by kongondo: "@Skaggs4: your post looks suspiciously like spam/unauthorised advertisement to me. I have removed the link to the outside resource."
OrganizedFellow Posted October 9, 2015

Revisiting an old thread: on my Debian box, I have a dump.sh script that I run manually. I also have a git pre-commit hook that runs the same script. Then I git commit to my GitLab account and rsync my site to the live server. I want to automate as much as possible and learn bash as I go. My next script (once I figure out how) will rsync the latest dump.sql to the live site and import it.

EDIT: On second thought, since I am using git, is there a way to dump just the latest changes in the database and have that teeny tiny SQL file, instead of dumping the whole thing? Mine is still relatively small, but over time it will grow.
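The pre-commit hook described above might look like this sketch, saved as .git/hooks/pre-commit and made executable; the path to dump.sh is a placeholder:

```
#!/bin/sh
# Regenerate the database dump before every commit so the committed
# dump.sql always matches the code being committed.
/home/me/bin/dump.sh || exit 1   # hypothetical path to the dump script

# Stage the fresh dump; without this, the commit would record the
# previous dump rather than the one just produced.
git add dump.sql
```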
Adam Kiss Posted October 9, 2015

I don't think there is anything simple for diffing a database against a dump. (Additionally, if you only dumped the diffs, you'd then have to diff the database against a whole set of dumps… yeah.)
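One workaround, since the dump lives in git anyway: you can't easily dump only the changes, but you can make the full dump diff-friendly, so git itself stores only the delta between commits. A hedged sketch (the database name is a placeholder):

```
# One INSERT per row and no dump-date comment, so consecutive dumps
# differ only where the data actually changed. Each dump is still
# complete, but git packs the near-identical versions very compactly.
mysqldump --skip-extended-insert --skip-dump-date mysite > dump.sql
```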
OrganizedFellow Posted October 12, 2015

http://dbv.vizuina.com/ (my Googleism is pretty amazing today, lol). Looks like a pretty cool project.