About two years ago, after a primary hard drive failure, I wrote a long post on what I called the “backup tripod”. As I said then, the tripod consists of three classes of regularly performed backup: offsite clones, incremental local backups, and cloud backups. For me, those were Carbon Copy Cloner, Time Machine, and a few files on Dropbox on an as-needed basis, respectively. The trade-offs of each leg are such that you need all three for a complete solution.
At the time, I was pretty strongly against any sort of comprehensive cloud backup solution. I was suspicious of their security practices, and disliked the lack of control. I think this was in part inspired by bad experiences with Windows backup products for my work desktop, as well as the sketchy TV commercials from companies like Mozy. I’ve since changed my mind, convinced in large part by The Wirecutter‘s selection of Crashplan as their Best Online Backup Service. Why do I trust their review? A combination of Daring Fireball‘s recommendation, and the fact that we are very happy with the Omega 8300 juicer they recommended. Crashplan’s own website also discusses security knowledgeably, which makes me think they’re following best practices.
Another factor is that, thanks to a little life change (marriage to Andrle!), I’m now backing up more than just my own computer. Crashplan’s family plan covers her 11″ MacBook Air as well, since we don’t currently run OS X Server or own a Time Capsule. While we have an offsite clone of her laptop, a cloud backup service can capture more recent changes. Unfortunately we’re not always on top of updating those offsite clones, so they can lag behind by a month or more, which is just asking for data loss.
At first glance, it might seem that an online backup service is all you need; it combines the benefits of an offsite backup and incremental backups, being both complete and updated regularly. However, while I’m sure Crashplan would be quite happy to be your only backup provider, I maintain that you still need all three legs. Each of the other two retains an advantage: the clone gives you speed of recovery, and Time Machine gives you frequency of increments.
My initial Crashplan backup (roughly 450 GB) took almost two weeks. Part of that problem is the ridiculous imbalance between downstream and upstream bandwidth with almost all ISPs, including Comcast. Downloading to restore from backup wouldn’t be quite that slow, but could easily take a couple of days after a catastrophic failure. The second problem is that I’d need some working OS install with which to run the Crashplan client. Both downsides are solved by the first leg, a bootable offsite clone. I store mine in a locked drawer in my locked office in a locked building at work, only a 20-minute round trip by bicycle in a data emergency. Crashplan does offer seeded backups and restores, but it’s well over $150, which at best gets it to you on a hard drive the next day. Although this wouldn’t be a problem for me, it’s also limited to a 1 TB USB drive, which for some people probably isn’t nearly enough.
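For a sense of scale, the back-of-the-envelope math is easy to sketch. The 3 Mbps up and 20 Mbps down figures below are assumptions standing in for a typical residential cable connection of the time, not measurements of my actual line:

```python
# Rough transfer-time math for a full backup or restore.
# Bandwidth figures are illustrative assumptions, not measurements.

def transfer_days(size_gb, mbps):
    """Days needed to move size_gb (decimal GB) at mbps megabits/sec."""
    bits = size_gb * 8 * 1000**3         # GB -> bits
    seconds = bits / (mbps * 1000**2)    # Mbps -> seconds
    return seconds / 86400               # seconds -> days

print(transfer_days(450, 3))   # upload at 3 Mbps: roughly 14 days
print(transfer_days(450, 20))  # restore at 20 Mbps: roughly 2 days
```

Those numbers line up with the “almost two weeks” I saw on upload, and with a multi-day restore after a catastrophic failure.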
While Crashplan does promise regular backups, that can’t always help you roll back a very recent set of changes, or recover from an accidental deletion made right after a change. That’s where the combination of OS X Lion’s Versions feature and Time Machine can save you on a per-file basis (and, as before, Dropbox may be useful).
Speaking of Dropbox, it has an additional role for me these days: synchronization for iOS apps. iCloud is still a pretty nascent feature, and for me only really helps moving data within one app between my iPhone and iPad. Dropbox, because it works through the filesystem rather than abstracting it away, provides a combination of cloud backup and cloud sync for files across all of my computers and devices. Probably the most important example of this is my 1Password keychain, synced between my home and work desktops and my iOS devices, which is absolutely critical since I use generated passwords everywhere. I also use it to move files into GoodReader on my iPad for work, and to export files from Textastic, which I occasionally use for code editing. (I should add that my iDevices are backed up to my computer, not iCloud.)
Overall I think Crashplan will fit fairly seamlessly into my current backup setup. Obviously the hope is that I’ll never need it!
One quick aside that’s related to backups generally, but not so much to backing up your computer: update a local copy of your social network data periodically. At least for me, I generate a lot of content on Facebook and Twitter, and I would be sad to lose it. A data loss on their end is probably less likely than some kind of major service change, but either of them could make it impossible to access your old posts.
Somewhat ironically, Facebook makes this a lot easier than Twitter; just go to Account Settings and click the tiny “Download a copy” link at the bottom of the page. Unfortunately Twitter’s API only provides access to your roughly 3,200 most recent tweets; I hope this changes at some point. I have all of my tweets archived to Pinboard, but that only goes back to May 2011, missing over two years of usage.
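If you do want to pull down what the API will give you, the old-style paging is simple to sketch. Everything here (the v1 endpoint URL, parameters, and helper names) reflects the Twitter REST API as it existed at the time and is illustrative, not a working archiver:

```python
# Sketch of max_id paging against Twitter's v1 user_timeline endpoint.
# The URL and parameters are era-specific assumptions; adapt before use.

def next_max_id(page):
    """Given one page of tweet dicts, compute max_id for the next
    (older) page: one less than the smallest id seen so far."""
    return min(tweet["id"] for tweet in page) - 1

def timeline_url(screen_name, max_id=None, count=200):
    """Build a user_timeline request URL for one page of tweets."""
    url = ("https://api.twitter.com/1/statuses/user_timeline.json"
           "?screen_name=%s&count=%d" % (screen_name, count))
    if max_id is not None:
        url += "&max_id=%d" % max_id
    return url
```

Fetch pages in a loop, feeding each page’s next_max_id into the next request, until an empty page comes back; the API simply stops short of your oldest tweets.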
Cloud backups now have my full blessing as part of the Backup Tripod. Crashplan seems to be the best service for the cost, and fits in well with the other legs of the tripod. The steps I’ve suggested in these two posts are fairly easy (albeit Mac-centric); at this point I think very few people have an excuse for losing data due to not having backups. I can’t stress this enough: back up your data, or someday you’ll get burned. Help your less-skilled family members set up automated backups that they don’t have to worry about.
Increasingly our lives are recorded almost entirely in bits. While there are many advantages to this, too numerous to go into here, bits are inherently much more ephemeral. Regular backups are one of the best ways to avoid complete data loss, although they may need to be coupled with occasional conversions in order to avoid file format rot. Make sure you have a plan to protect all of your data, on your own devices and in the cloud.
This post is a guide for building your own version of Apache’s mod_python as a Universal Binary in order to support a custom Django install containing the Twitter libraries. As you can probably gather, this information is likely only useful to advanced Mac users who are comfortable in Terminal with compiling and installing software from source. If you’re still interested, gird your loins, crack your knuckles, grab some Mountain Dew, and read on.
Mac OS X 10.5 “Leopard” is yet another step forward into the world of 64-bit. At the same time, Apple has to support both PowerPC and Intel architectures. This is no mean feat, and it’s where “fat” or Universal binaries come in. Apple also has an explanation of Universal binaries, although it’s heavy on PR. This is all well and good, but there is one problem: once you make this leap, all of your library dependencies must contain the architecture you’re running as. Much software is still built as 32-bit only; while it may be a “fat” binary, containing both Intel and PowerPC machine code, it only has the 32-bit versions thereof. For reference, the names of the various architecture flags:

- ppc: 32-bit PowerPC
- ppc64: 64-bit PowerPC
- i386: 32-bit Intel
- x86_64: 64-bit Intel
Huzzah naming conventions! There’s a lot of history in those names. I’ve linked to the relevant Wikipedia articles if you’re curious; these flags will be coming up again later when configuring various builds. The main thing to note is that most build configurations default to i386 on Intel Macs (even though Core 2 and Xeon processors are natively 64-bit), probably because most software is developed for 32-bit versions of Windows and Linux. As you’ll see, we’ll be overriding that default in several places to get this whole mess working.
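To see which slices a given binary actually contains, Mac OS X ships the file and lipo tools. Since those only run on a Mac, the inspection commands below are shown as comments, and the flag value itself is just an example of the kind of override we’ll be making:

```shell
# Example override for a 64-bit Intel build; swap in ppc, ppc64, or
# i386 as appropriate for your machine.
ARCHFLAGS="-arch x86_64"

# On the Mac itself, inspect a binary's slices:
#   file /usr/sbin/httpd
#   lipo -info /usr/sbin/httpd
#
# Then feed the override into an autoconf-style build:
#   CFLAGS="$ARCHFLAGS" LDFLAGS="$ARCHFLAGS" ./configure
echo "$ARCHFLAGS"
```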
Unfortunately, Universality is a cancer, which in my case starts with the Apple-shipped version of the Apache web server in 10.5, a Universal binary. Everything it touches needs to be Universal as well, so that Apache can run as a 64-bit process by default. I wanted to add Django support on my web server via mod_python, specifically to play with the Twitter API, which meant I also needed to build python-twitter and its dependencies, as well as a MySQL Python module to let Django talk to my database. None of these are included in the default Leopard version of Python 2.5.1.
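For context, hooking Django into Apache this way uses the mod_python handler configuration from the Django documentation of that era; the location, path, and settings module below are placeholders for your own project, not my actual setup:

```apache
<Location "/mysite/">
    SetHandler python-program
    PythonHandler django.core.handlers.modpython
    SetEnv DJANGO_SETTINGS_MODULE mysite.settings
    PythonPath "['/path/to/project'] + sys.path"
    PythonDebug On
</Location>
```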
After getting all of this set up, and trying to start my test Django app, mod_python was giving me errors about architecture. As it turns out, the included version of Python is only a “fat” 32-bit binary, not a Universal binary… which means all of the new Python modules I just compiled to support Twitter and Django were only 32-bit, which in turn means that the included Universal version of Apache and mod_python couldn’t use them. Yay.
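A quick way to see which mode your interpreter is actually running in, beyond what file reports about the binary on disk, is to ask Python itself; this check works on 2.6 and later:

```python
import platform
import sys

# A 64-bit process has a pointer-sized maxsize above 2**32; launched
# via "arch -i386 python" on a Mac, this would report 32-bit instead.
bits = "64-bit" if sys.maxsize > 2**32 else "32-bit"
print(platform.machine(), bits)
```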
Below the cut you’ll find my complete instructions for compiling all of the relevant components and their dependencies. I also took the opportunity to update to the latest release version of Python 2.6 and MySQL 5.1, and as a side effect my database server is now running as a 64-bit process. Progress has been made here. Feel free to comment or contact me if you have questions.
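As a preview of the Python piece: 2.6’s configure script grew Mac-specific switches for exactly this situation, --enable-universalsdk and --with-universal-archs. The SDK path below is what I’d expect on a stock Leopard developer install, so treat it as an assumption; the build commands are left as comments since they only make sense inside the unpacked source tree:

```shell
# Configure options for a 4-way (ppc/ppc64/i386/x86_64) Universal
# Python 2.6 build; the SDK path is an assumption for a stock
# Leopard Xcode install.
PY_OPTS="--enable-universalsdk=/Developer/SDKs/MacOSX10.5.sdk --with-universal-archs=all"

# In the unpacked Python 2.6 source directory, roughly:
#   ./configure $PY_OPTS MACOSX_DEPLOYMENT_TARGET=10.5
#   make
#   sudo make install
echo "$PY_OPTS"
```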