When I began writing my book, the first thing I did was think of a backup strategy for my manuscript. I probably went a little overboard with it, but then again I wasn't going to lose more than 300 pages to a power surge. My PC has two internal hard drives, so the first thing I did was keep copies on both of them. That way I had a backup that was very easy to access and restore. I kept yet another copy on my external hard drive, which gave me another backup that was easy to access but wouldn't be lost if my PC decided to burst into flames.

Note that all the copies I've talked about so far were on the same type of medium and in the same location. The only protection that gave me was against human error (which is a considerable factor). The next thing I did was keep a copy on a USB stick that I would carry around with me wherever I went. This gave me a new type of medium and, at least occasionally, a different location. Since I was the only one working on the manuscript, carrying the USB stick around with me actually made a lot of sense: as long as I was okay, the USB stick would be okay, and in the unlikely event that I spontaneously combusted, the well-being of my USB stick would probably be the least of my concerns.

I didn't leave it at that, though. I kept yet another copy on an FTP server hosted not only outside of my house but in a different federal state. That gave me a location far enough from where I was that I wouldn't have to worry about forest fires, floods or the living dead. The last thing I did, and I'm starting to sound a little obsessive right now, was to burn my manuscript onto a CD once a week and keep those CDs in various places.
Did I ever have to restore one of my backup copies? Well, no. But I could have. As a matter of fact, I regularly tested my backups to see if they actually worked. That meant downloading the copies from the FTP server and checking the copy on the USB stick, on the CDs and on all the hard disks. If you're not testing your backups, you might as well not make any at all.
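If you want to automate that kind of check, one approach is to compare each original file against its backup copy by checksum. Here's a rough Python sketch of the idea; the function names and the `*.doc*` pattern are my own assumptions, not part of the original batch setup:

```python
import hashlib
from pathlib import Path

def file_hash(path, chunk_size=65536):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_backup(original_dir, backup_dir, pattern="*.doc*"):
    """Compare each original against its backup; return names that differ."""
    mismatches = []
    for original in Path(original_dir).glob(pattern):
        backup = Path(backup_dir) / original.name
        if not backup.exists() or file_hash(original) != file_hash(backup):
            mismatches.append(original.name)
    return mismatches
```

An empty result means every backup copy is byte-for-byte identical to its original; anything else tells you exactly which file to worry about.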
So you're probably thinking that I must've spent more time making backup copies than actually working on the book, but as a matter of fact, all my daily backups were completely automated. All I had to do to back up my work was double-click a shortcut on my desktop. The shortcut would then run a Windows batch file that did all the copying as well as the FTP upload. Here's an extract of that batch file:
@echo off
rem Create today's backup folder (%date% holds the current date).
mkdir "d:\Backups\book\%date%\"
rem Rotate: drop the old .bak copies, then demote the previous backups to .bak.
del /Q /F "d:\Backups\book\%date%\*.bak"
ren "d:\Backups\book\%date%\*.doc?" *.bak
rem Copy the current documents into today's folder.
copy *.doc? "d:\Backups\book\%date%\"
For my local backups, I made a new backup folder every day. This gave me very basic versioning and allowed me to recover things I had overwritten in newer revisions. The %date% variable that makes this happen (in conjunction with the mkdir command) has been available since Windows XP and contains the current date in a format specific to your system's locale. This mechanism also offers some protection against file corruption: imagine you click on File -> Save and your program tells you it has saved your document, but in reality it has produced a file that it will never be able to open again. Then you run your sophisticated backup batch file and it overwrites your only backup copy with that corrupted file. Sounds awful, doesn't it? Well, that wasn't going to happen to me, but I also didn't want to lose all my progress for that day. So I added another layer of protection by always "backing up" the most recent backup copy. Put another way, I rotated between backup copies by renaming the previous copy to end in ".bak".
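The dated-folder-plus-.bak rotation could also be sketched in a more portable language. Here's a rough Python equivalent; the folder layout and the `*.doc*` pattern simply mirror the batch file, and the function name is my own:

```python
import shutil
from datetime import date
from pathlib import Path

def rotate_and_backup(source_dir, backup_root, pattern="*.doc*"):
    """Copy documents into a per-day folder, keeping the previous
    copy of each file around as a .bak before overwriting it."""
    target = Path(backup_root) / date.today().isoformat()
    target.mkdir(parents=True, exist_ok=True)

    # Rotate: drop the old .bak files, then demote the previous copies.
    for old in list(target.glob("*.bak")):
        old.unlink()
    for prev in list(target.glob(pattern)):
        prev.rename(prev.with_name(prev.name + ".bak"))

    # Copy the current documents into today's folder.
    for doc in Path(source_dir).glob(pattern):
        shutil.copy2(doc, target / doc.name)
    return target
```

Running it twice on the same day leaves you with today's copy plus a .bak of the previous run, which is exactly the corruption safety net described above.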
For my online backups, I used a handy little program called "ftp.exe" that ships with Windows. Ftp.exe is, for the most part, an interactive console application and as such wouldn't be particularly useful in an automation context. It does, however, allow you to pass it a text file that contains all the commands you want it to execute. Now all you have to do is have your batch file create that text file on the fly and pass it to the application:
echo user [USERNAME]> ftp.txt
echo [PASSWORD]>> ftp.txt
echo bin>> ftp.txt
for %%i in (*.doc?) do echo put "%%i">> ftp.txt
echo quit>> ftp.txt
ftp -n -s:ftp.txt [HOST]
del ftp.txt
The first three lines write the login commands to the text file and set the transfer mode to "binary". The next line loops over all the Word documents in the current directory ("doc?" matches files ending in "doc" or in "doc" plus one more character, which accounts for Office 2007's new "docx" file extension) and writes their file names, each preceded by the "put" command, into the text file. Finally, the "quit" command disconnects from the FTP server. Now all you have to do is run ftp.exe and pass it the name of the text file (the -n switch suppresses automatic login, which is why the script starts with an explicit "user" command).
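The generate-a-script-file trick is easy to express in any language. As a sketch, here's a small Python function that builds the same command file the batch loop produces; the function name is hypothetical and just for illustration:

```python
def build_ftp_script(username, password, filenames):
    """Build the command script that ftp.exe expects via -s::
    login, switch to binary mode, one "put" per file, then quit."""
    lines = [f"user {username}", password, "bin"]
    lines += [f'put "{name}"' for name in filenames]
    lines.append("quit")
    return "\n".join(lines) + "\n"
```

Write the returned string to ftp.txt and you have exactly what the batch file assembles with its chain of echo redirections.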
I guess what I'm trying to illustrate is that you can create a fully functional backup solution in less than 20 lines of batch code, and what that means is that there's really no reason why you shouldn't back up your stuff. Of course, you don't have to use my do-it-yourself backup solution; there's much better software for that. What's really important is that you do make backups. Your photos, your emails, your contacts and of course your code, all that stuff is well worth preserving. One day, your hard disk will fail, and a backup will mean the difference between a shitty day and an extremely shitty day.