I think both of the strategies you describe are good, noting the issue mouser mentioned of not having anything off-site. That said, I do want to bring up a few issues which I think are responsible for many people *not* backing up (myself included, for a long time). First, with regard to both of your strategies, I see complexity as a potential deterrent for the "average user". This ties into my second point, which is that people often spend a lot of time seeking and/or planning the perfect backup strategy before implementing anything. Doing this only results in having no backup strategy at all for a long time, if ever, which is what happened to me. Having *any* backup system in place (that is *not* RAID, which is not a "backup" strategy), even an imperfect one, is better than none at all. In fact there is no perfect backup system. So focus first on just having *a* backup strategy, even if it's only a simple sync to an external drive (or an additional internal drive). You can improve your backup system easily later, but waiting until you have the perfect system designed just puts your data at risk for longer.
Having made those points, here's my system in brief (molded by my own unique needs):
I have 1 "workstation" machine where I do most of my work. I have 1 "server" machine that holds all my media files and is connected to my TV/stereo, and from which I want access to some of my media files.
I sync all document files from my workstation to my server over the network. Since this is done daily, the amount of data transferred each run is small and isn't really an issue even over 100Mbit Ethernet (I plan to upgrade to gigabit at some point). Syncing gives my server access to all of my workstation's files, particularly photos, which I sometimes want to display on the TV. It also serves as a first line of defense.
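Conceptually the sync is nothing fancy, just a one-way mirror of new or changed files. Purely as an illustration (the paths below are placeholders, and a real sync tool handles deletions, conflicts, and locked files far better), a Python sketch of the idea:

    import shutil
    from pathlib import Path

    # Hypothetical paths - substitute your own document folder and the
    # server's network share.
    SOURCE = Path(r"C:\Users\me\Documents")
    DEST = Path(r"\\server\sync\Documents")

    def mirror(src: Path, dst: Path) -> None:
        """One-way sync: copy anything new or more recently modified."""
        for item in src.rglob("*"):
            target = dst / item.relative_to(src)
            if item.is_dir():
                target.mkdir(parents=True, exist_ok=True)
            elif (not target.exists()
                  or item.stat().st_mtime > target.stat().st_mtime):
                target.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(item, target)  # copy2 preserves timestamps

    if __name__ == "__main__":
        mirror(SOURCE, DEST)

In practice you'd schedule something like this (Task Scheduler, cron, or the sync tool's own scheduler) so it runs daily without any intervention.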
Next I use CrashPlan running on my workstation to do regular (at least daily, usually every 4 hours) backups, both to a couple of external drives *and* to the "cloud" (a remote data storage service). CrashPlan offers a "seed" service to get an initial large backup started by shipping a drive to you and back. After that you only need to send changes over the wire, which are usually minimal and are compressed and encrypted. CrashPlan gives me off-site storage, some amount of versioning (enough for my needs), and local backup as well, all in one package. As I've written elsewhere, I'm not entirely happy with CrashPlan due to its high memory use, but I now have 15GB of memory in this machine so it's largely a non-issue at this point. For others with large volumes of files it may still be an issue.
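CrashPlan's engine is proprietary, so this is *not* its actual method, but the "only send changes" idea boils down to change detection. A rough Python sketch (paths are hypothetical; a real backup engine also compresses, encrypts, and versions the changed data before uploading):

    import hashlib
    import json
    from pathlib import Path

    # Hypothetical locations - substitute your own.
    ROOT = Path(r"C:\Users\me\Documents")
    STATE_FILE = Path("backup_state.json")

    def file_hash(path: Path) -> str:
        """Content hash of one file, read in 1MB chunks."""
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def changed_files(root: Path, state_file: Path) -> list[Path]:
        """Return files whose content hash differs from the last run."""
        old = json.loads(state_file.read_text()) if state_file.exists() else {}
        new, changed = {}, []
        for p in root.rglob("*"):
            if p.is_file():
                digest = file_hash(p)
                new[str(p)] = digest
                if old.get(str(p)) != digest:
                    changed.append(p)
        state_file.write_text(json.dumps(new))
        return changed

    if __name__ == "__main__":
        for p in changed_files(ROOT, STATE_FILE):
            print("needs upload:", p)

On a daily run, only the handful of files this flags would actually need to cross the wire, which is why the ongoing bandwidth cost is so low after the initial seed.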
The advantages I see in my system: it's extremely low maintenance, since both the sync program and the backup program run automatically without my intervention. The backup app notifies me regularly via email of backup status, so I don't even really need to check it; if I don't get an email, then I know something is up. I also have off-site backup without the hassle of actually taking a physical unit off-site regularly. Finally, I have direct access to all my files on my server as well as my workstation *with* local-file speeds (an issue when you're trying to enjoy slide shows of huge RAW images, for example - gigabit networking might eliminate the need for this though).
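You could even automate that "no email means trouble" check with a scheduled script. A sketch of the idea in Python (the mail server, credentials, and sender address are all made up; substitute your own):

    import imaplib
    from datetime import date, timedelta

    # All made-up values - use your mail server, an app password, and
    # the address your backup app sends status reports from.
    IMAP_HOST = "imap.example.com"
    USER, PASSWORD = "me@example.com", "app-password"
    BACKUP_SENDER = "reports@backup.example.com"

    def report_arrived_recently(days: int = 2) -> bool:
        """True if a backup status email arrived in the last `days` days."""
        since = (date.today() - timedelta(days=days)).strftime("%d-%b-%Y")
        with imaplib.IMAP4_SSL(IMAP_HOST) as imap:
            imap.login(USER, PASSWORD)
            imap.select("INBOX", readonly=True)
            _, data = imap.search(None, f'(FROM "{BACKUP_SENDER}" SINCE {since})')
            return bool(data[0].split())

    if __name__ == "__main__":
        if not report_arrived_recently():
            print("WARNING: no backup status email - check the backup!")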
The disadvantages are that I need two pieces of software, a sync system and a backup system. If CrashPlan could do both, I would be very happy and would no longer mind the memory use. However, CrashPlan always backs up to its own proprietary file format, so this is not currently an option. The off-site backup also requires reasonable (though not extreme) outgoing bandwidth. I have a bonded ADSL2+ line from Sonic.net that gives me a theoretical 44/2 connection (22/4 with AnnexM, which I do use), and in practice I get about 3Mbit/s upload speeds, which is fine for backing up my data remotely. Most consumer cable connections provide at least 2Mbit upload these days, so for many people this is also an option. Bandwidth caps may be an issue for some (do they affect outgoing bandwidth, only incoming, or total bi-directional?).
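To put that upload speed in perspective, the arithmetic is simple (the 2GB/day change volume here is a made-up example):

    # Rough upload-time arithmetic at my measured ~3Mbit/s.
    daily_change_gb = 2          # assumed changed data per day (example)
    upload_mbit_s = 3            # measured upload speed
    bits_to_send = daily_change_gb * 8 * 1000**3        # GB -> bits (decimal units)
    hours = bits_to_send / (upload_mbit_s * 1000**2) / 3600
    print(f"{daily_change_gb}GB at {upload_mbit_s}Mbit/s takes ~{hours:.1f} hours")
    # -> 2GB at 3Mbit/s takes ~1.5 hours

So even a fairly heavy day of changes finishes uploading overnight, and compression shrinks it further.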
I also don't have system images in my backup scheme, but I don't really need them and find making them to be more hassle than it's worth. I'm OK doing a fresh install of Windows if I happen to have a system meltdown; in that event it's likely the system itself died anyway, in which case a system image may not even be ideal (although I know some imaging tools let you "retarget" an image to new hardware).
In the future when I upgrade to gigabit (just the router needs the upgrade; both machines already have gigabit ports, of course), I may not continue to do the syncing. I feel that 1 local backup and 1 remote is probably good enough, and it would simplify my system.
Although I think mouser's approach of using all internal drives for backup is OK, it's important to consider that simple household disasters can quickly destroy a single computer system, including all the hardware in it (think of a plumbing leak on the floor above hitting your computer while it's on and you're not home). Something as simple as backing up to an external drive (eSATA or USB 3 are plenty fast enough), or to another system in the house over gigabit LAN, provides a reasonable level of redundancy without the potential hassle of going off-site. Granted, if your house burns down or there's a flood you're not protected, but it does address the likely more common smaller household disasters like the one I just mentioned. Another option to consider is an in-home fire- and flood-proof safe, which can largely replace a proper off-site backup too.
I still intend to write a follow-up blog post to my original backup post that talks about all this in more detail.
- Oshyan