If you’re reading this article, you are likely aware of the benefits of maintaining backups for your website. If you happened to stumble upon it, please take note: a successful backup can save hours, weeks, or months of work. With larger deployments, a successful backup can save years of work from disaster. It doesn’t stop there, though. While some people may feel comfortable with a single restore point, I, the slightly short-tempered technical support analyst, insist on creating another. I mean, who knows? My data could get scrambled by a geomagnetic storm. A stray meteorite could crash into my computer. The point is, my cat would ultimately be responsible for it.

In this article we will concentrate on three things: new data, existing data, and any infrastructure data that coincides with the existence of our data. Since I don’t want to write a whole book on this subject, I will outline a few methods for the three largest PC operating systems. There are also many pre-built plugins available to maintain a backup for your favorite CMS. If you’re interested in going that route, please reference your favorite search engine for backup plugins.

New data: Once you’ve started your new project, it’s important to identify the hot-key that creates a save point. It will save you both time and headaches. Most software recognizes the universal save hot-key, Ctrl+S. While writing this page, I’ve saved it at least two dozen times, all thanks to this little wonder. Frequent saves may seem excessive, but many disasters can be attributed to a lack of preparedness. Some software even offers automated saving, so be sure to search for that functionality via your favorite search engine.

Existing data: Now that my data is saved, I don’t want to go through the trouble of constantly backing up what I already have saved.
It would ultimately break my workflow, and I would probably end up browsing a large “social news website where the registered users submit content, in the form of either a link or a text ‘self’ post. Other users then vote the submission ‘up’ or ‘down,’ which is used to rank the post and determine its position on the site’s pages and front page.” And who wants that, honestly?

- Linux / Mac

While a full backup runs every week, the important bits are saved quite frequently. Every 15 minutes, I sync various directories to a remote destination. Since I’m currently operating inside a virtual machine running Linux, I can easily accomplish this with a cron job running rsync from the remote system:
rsync -avz [user]@[remote address]:[remote directory] [local directory]

(You can easily reverse this to send data to the remote system.)
rsync -avz [local directory] [user]@[remote address]:[remote directory]

(Remember to replace each [sample] placeholder with your own values. Reflecting the first example, the final entry with the cron schedule included will look something like this:)
0,15,30,45 * * * * rsync -avz firstname.lastname@example.org:/home/kevin/documents /home/backups/spot/home/kevin/documents

I use certificate-based authentication for users, so an entered password isn’t required for verification. To learn more about certificate-based authentication, please read this article: http://www.laubenheimer.net/ssh-keys.shtml

Each cron job can be scheduled to reflect various time frames. To avoid digging through the crontab manual, you can generate a cron schedule via one of the several websites that offer cron job generation; http://www.crontab-generator.com/ is a simple one I found via Google.

Macs also offer rsync and cron jobs/scheduled tasks, so the previous example works on both operating systems. For some, rsync may not be available. If you have shell access to your website, your web provider should offer SCP or FTP commands from the shell. I have a remote destination, known as a NAS (network-attached storage), set up inside my home, though you don’t need to go to that level. Obtaining a backup can be as simple as writing important data to a CD, a flash drive, or a separate internal/external hard drive.

- Windows

If you’re running Windows, backups can be just as simple to maintain. I highly suggest using WinSCP; once you have it downloaded, please reference this link to learn how to set up an automated backup on your Windows system: http://winscp.net/eng/docs/guide_automation

- cPanel

cPanel has been doing a wonderful job of creating a stable, scalable, and dynamic frontend for LAMP servers for many years. A few features should be noted. First, R1Soft backups: if you see this feature available through cPanel, you may already have a queue of backups to access. The current cPanel release does not cover database backups; you can find a method for backing up your database in the ‘Infrastructure Data’ section of this article.
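Before we move on from rsync entirely, the cron and rsync pieces from the Linux / Mac section above can be tied together into a small rotating-snapshot script. This is only a sketch under my own assumptions: the host, paths, and retention count are hypothetical example values, so adjust them for your setup.

```shell
#!/bin/sh
# Sketch of a rotating rsync backup. The source host, paths, and
# retention count below are example values, not recommendations.

KEEP=7   # how many dated snapshots to retain

# Remove the oldest snapshot directories under $1, keeping the newest $KEEP.
prune_snapshots() {
    root="$1"
    count=$(ls -1d "$root"/*/ 2>/dev/null | wc -l)
    extra=$((count - KEEP))
    [ "$extra" -gt 0 ] || return 0
    ls -1d "$root"/*/ | sort | head -n "$extra" | while read -r old; do
        rm -rf "$old"
    done
}

# Pull the remote directory into a dated folder, then prune old snapshots.
backup() {
    src="$1"         # e.g. firstname.lastname@example.org:/home/kevin/documents
    dest_root="$2"   # e.g. /home/backups/spot
    today="$dest_root/$(date +%Y-%m-%d)"
    mkdir -p "$today"
    rsync -avz "$src" "$today/"
    prune_snapshots "$dest_root"
}
```

Add a line like `backup firstname.lastname@example.org:/home/kevin/documents /home/backups/spot` at the bottom of the script and call it from a daily cron entry; once more than seven dated snapshots exist, the oldest are deleted automatically.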
Once you start running a database dump into a portion of your user directory, you should start seeing that data saved with your R1Soft snapshots. The Backup feature, or the Backup Wizard, will yield a backup of the requested information. This is a one-time operation, and each backup through this feature needs to be obtained manually. The following article should provide a bit more information on the subject: http://docs.cpanel.net/twiki/bin/view/AllDocumentation/CpanelDocs/BackupWizard

If you don’t see these features listed, it’s advisable to open a ticket with support to determine why the Backup feature isn’t available. R1Soft only offers backups through licensed providers.

Infrastructure Data: Your website likely uses MySQL for database storage. With cPanel, you can create a MySQL backup via the ‘Backup’ or ‘Backup Wizard’ feature. It’s only a one-time operation, though, so you would need to manually download every backup via this feature. You may want to go to the next level: automation! Remember what we discussed about cron jobs? Well, you can dump all current MySQL data into the directory you’re backing up. For example:
mysqldump --opt -u [uname] -p[pass] [dbname] > [backupfile.sql]

Example:
mysqldump --opt -u kevin -p123password kevin_database > /home/kevin/documents/kevin_database.sql

Just add that command to the end of your cron entry. You can create cron jobs in cPanel, or live via shell within your crontab. The full command (cron schedule included) should look like this:
30 * * * * mysqldump --opt -u kevin -p123password kevin_database > /home/kevin/documents/kevin_database.sql

So, that’s it? At this point you should have a better idea of how to adopt a simple backup process. There are many more alternatives available. As discussed earlier, most popular CMSs (content management systems) provide plugins that create backups for you. It’s important to store the backups on a system you trust. An additional solution would be to pay a third party to maintain backups for you; it can avoid a headache, and would take some stress off you. But who says you can’t keep your own copy? :)

Special thanks to Kevin B. for letting us use this great tutorial!
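P.S. One refinement worth sketching for the mysqldump cron job above, using the same hypothetical credentials and paths: writing each dump to a date-stamped, compressed file means today’s backup no longer overwrites yesterday’s, and a find command keeps the directory from growing forever. Treat this as an illustrative sketch, not a drop-in script.

```shell
#!/bin/sh
# Sketch: date-stamped, compressed MySQL dumps with a retention window.
# Credentials, database name, and paths are example values. Note that a
# password on the command line is visible to other local users.

DB_USER="kevin"
DB_PASS="123password"
DB_NAME="kevin_database"
BACKUP_DIR="/home/kevin/documents"
RETAIN_DAYS=7

# Name for today's dump, e.g. kevin_database-2015-06-01.sql.gz
dump_name() {
    echo "$DB_NAME-$(date +%Y-%m-%d).sql.gz"
}

# Dump and compress, then delete dumps older than the retention window.
run_dump() {
    mysqldump --opt -u "$DB_USER" -p"$DB_PASS" "$DB_NAME" \
        | gzip > "$BACKUP_DIR/$(dump_name)"
    find "$BACKUP_DIR" -name "$DB_NAME-*.sql.gz" -mtime +"$RETAIN_DAYS" -delete
}
```

Call `run_dump` at the end of the script, and point the cron entry from earlier at the script instead of at the raw mysqldump command.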