Blog Archives

Netflix Ratings Import/Export

Andrle and I each used to have our own Netflix accounts. For a time, after we got married, we kept both; mine was used for disc rentals, and hers for streaming. Eventually we realized we were mostly just streaming, so we canceled my account and went to a single shared account. Sadly, this meant I lost over 1400 movie ratings, since Netflix provides no official way to export. After a while, I gave up on ever getting my ratings back.

Finally, earlier this month, Netflix embraced their role as a household service and added profiles. I decided to pay for a month of streaming and reactivate my old account to see if I could get my ratings out. I found this browser script, which, after I applied a patch described there, gave me a JSON file containing all of my ratings. This is also a useful backup to have in case Netflix ever goes away (unlikely).

Unfortunately there is still no easy way to import ratings into Netflix, so I wrote a very basic Chrome extension that reads the exported JSON file and clicks through each movie, rating it. It’s available on GitHub. Make sure to read the instructions included with the code; it’s straightforward, but requires some power-user comfort to follow the steps, since I didn’t bother with an interface. (Incidentally, this is one thing I love about having programming skills – that sense of having more power and control over my own data.)
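If you want to sanity-check the export before importing, the JSON is easy to poke at from the shell. A minimal sketch using jq, assuming the export is a JSON array of rating objects; the title and rating field names here are guesses, since the actual keys depend on the export script:

  # Count the exported ratings
  jq 'length' netflix-ratings.json
  # Spot-check the first five entries (field names are hypothetical)
  jq '.[:5][] | "\(.title): \(.rating)"' netflix-ratings.json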

Obviously it would be nice if various online services practiced across-the-board data liberation (though with Facebook and Twitter adding export, it’s getting better), but hopefully this is one tiny step in helping other nerdy family units transfer their precious Netflix ratings.


A Better git-svn Log

This morning, Devon tweeted a great tip showing how to create a git alias that produces more readable logs. I decided I wanted to set this up.

At work, we use a central Subversion repository, but a number of us use git-svn because we prefer git’s various local branch tools for development. I decided that it would be useful to extend this alias to also include the Subversion revision number, if the commit has one. Unfortunately, this information (in the form of a Subversion URL) is stored by git-svn in the commit body, which may also include other notes added by a developer. git-log exposes this text via the %b format specifier, but since we want to do some post-processing to extract just the revision number, we’ll need to set up a shell alias.
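For reference, git-svn appends a trailer like the following to each commit body; this is what the post-processing below pulls the revision number from (the host, revision, and repository UUID here are made up):

  git-svn-id: svn+ssh://svn.example.com/repo/trunk@12345 0C6B0F3A-93A6-4BC5-AF77-3D4C01E3A2CA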

Here’s the final version I’ve added to my ~/.aliases:

alias glog='git log --graph --pretty=format:'"'"'%Cred%h%Creset%C(yellow)%d%Creset %s %Cgreen(%cr) %C(bold blue)<%an>%Creset%Cred%b%Creset'"'"' | perl -e '"'"'$log = do { local $/; <> }; $log =~ s/(>\e\[m\e\[31m)([^\n]*\n\| )*git-svn-id: svn\+ssh:\/\/[^\s\@]*\@(\d+) [0-9A-F-]+\n(\|| ) (\e\[m)/$1 r$3$5/gm; print "$log\n";'"'"' | less -RS'

First you’ll note the weird quote escaping – we want the alias to be single-quoted (no expansion), but we want the arguments to the commands to also be single-quoted. That’s where the '"'"' trick comes in; see this explanation on Stack Overflow. The format string is almost identical to Filipe’s solution, except for the addition of the body. The nasty Perl regular expression pulls the revision number out of the body while preserving the ANSI color escape sequences I inserted into the format string. Finally, we pass everything to less for paging; -R makes sure it interprets the colors correctly, while -S disables line-wrapping.
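As a minimal illustration of the trick, each '"'"' closes the current single-quoted string, appends a double-quoted literal single quote, and reopens single quoting, so these two definitions are equivalent:

  alias demo='echo '"'"'hello world'"'"''
  alias demo="echo 'hello world'"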

One unfortunate side effect of adding the post-processing is that you can’t pass the -p option to git log anymore: Perl needs to read in the whole log for multiline matching (in case a developer put literal newlines in a commit body), and diffs are large and could get caught in the regex filter.
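If you do want diffs, one workaround is a second alias that keeps the pretty format but skips the Perl post-processing; a sketch:

  alias glogp='git log -p --graph --pretty=format:"%Cred%h%Creset%C(yellow)%d%Creset %s %Cgreen(%cr) %C(bold blue)<%an>%Creset"'

You lose the Subversion revision numbers this way, but you get diffs back.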

Hopefully you find this useful!


Instagram-negative

If you were on social media much shortly before Christmas, you no doubt heard quite the kerfuffle about changes to Instagram’s Terms of Service that were to take effect on January 16th, 2013, related in part to Facebook’s purchase of the service earlier in the year. While much of the response was overblown and based on misunderstandings of the relevant legalese, and Instagram later apologized and canceled some of the changes, the folderol was a reminder to me that you can’t really trust a service you’re not paying to host your content under the license you want. Thus, I quit.


Baby Got Backup

A Mind Changed

About two years ago, after a primary hard drive failure, I wrote a long post on what I called the “backup tripod”. As I said then, the tripod consists of three classes of regularly performed backup: offsite clones, incremental local backups, and cloud backups. For me, those were Carbon Copy Cloner, Time Machine, and a few files on Dropbox on an as-needed basis, respectively. I think the trade-offs of the three legs complement each other, which is why you need all of them for a complete solution.

At the time, I was pretty strongly against any sort of comprehensive cloud backup solution. I was suspicious of their security practices, and disliked the lack of control. I think this was in part inspired by bad experiences with Windows backup products for my work desktop, as well as the sketchy TV commercials from companies like Mozy. I’ve since changed my mind, convinced in large part by The Wirecutter‘s selection of Crashplan as their Best Online Backup Service. Why do I trust their review? A combination of Daring Fireball‘s recommendation, and the fact that we are very happy with the Omega 8300 juicer that they recommended. Crashplan’s own website also talks about security in a knowledgeable way, which makes me think they’re engaging in best practices.

Another factor is that, due to a little life change (marriage to Andrle!), I’m now backing up more than just my own computer. Crashplan’s family plan takes care of her 11″ MacBook Air, since we don’t currently run OS X Server or own a Time Capsule. While we have an offsite clone of her laptop, a cloud backup service can capture more recent changes. Unfortunately we’re not always on top of updating those offsite clones, so they can lag behind by a month or more, which is just asking for data loss.

The Third Leg

At first glance, it might seem that an online backup service is all you need, since it combines the benefits of an offsite backup and incremental backups: it is both complete and updated regularly. However, while I’m sure Crashplan would be quite happy to be your only backup provider, I maintain that you still need all three legs. Each of the other two retains its own advantage: speed of recovery and frequency of increments, respectively.

My initial Crashplan backup (roughly 450 GB) took almost two weeks. Part of that problem is obviously the ridiculous imbalance between downstream and upstream bandwidth with almost all ISPs, including Comcast. Downloading to restore from backup wouldn’t be quite that slow, but could easily take a couple of days after a catastrophic failure. The second problem is that I’d need some working OS install with which to run the Crashplan client. Both downsides are solved by the first leg, a bootable offsite clone. I store mine in a locked drawer in my locked office in a locked building at work, which is only a 20-minute round trip by bicycle in a data emergency. Crashplan does offer seeded backups and restores, but it’s well over $150, and at best that gets a hard drive to you the next day. Although this wouldn’t be a problem for me, it’s also limited to a 1 TB USB drive, which for some people probably isn’t nearly enough.

While Crashplan does promise regular backups, it can’t always help you roll back a very recent set of changes, or an accidental deletion right after making changes. That’s where the combination of OS X Lion’s Versions feature and Time Machine can save you on a per-file basis (and, as before, Dropbox may be useful).

Speaking of Dropbox, it has an additional role for me these days: synchronization for iOS apps. iCloud is still a pretty nascent feature, and for me only really helps move data within one app between my iPhone and iPad. Dropbox, because it’s filesystem-based rather than “post-filesystem”, provides some combination of cloud backup and cloud sync for files across all of my computers and devices. Probably the most important example of this is my 1Password keychain, synced between my home and work desktops and my iOS devices, which is absolutely critical since I use generated passwords everywhere. I also use it to move files into GoodReader on my iPad for work, and to export files from Textastic, which I occasionally use for code editing. (I should add that my iDevices are backed up to my computer, not iCloud.)

Overall I think Crashplan will fit fairly seamlessly into my current backup setup. Obviously the hope is that I’ll never need it!

Social Media Exports

One quick aside that’s related to backups generally, but not so much to backing up your computer: update a local copy of your social network data periodically. At least for me, I generate a lot of content on Facebook and Twitter, and I would be sad to lose it. A data loss on their end is probably less likely than some kind of major service change, but either of them could make it impossible to access your old posts.

Somewhat ironically, Facebook makes this a lot easier than Twitter; just go to Account Settings and click the tiny “Download a copy” link at the bottom of the page. Unfortunately Twitter doesn’t provide API access to anything but your 2000 most recent tweets; I hope this changes at some point. All of my tweets are being archived to Pinboard, but that archive only goes back to May 2011, missing over two years of usage.
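For the curious, pulling down what Twitter will give you looked roughly like this at the time; a sketch against the v1 REST API, paging 200 tweets at a time up to the cap (screen name and file naming are illustrative):

  # Fetch recent tweets in pages of 200 until the API cap cuts us off
  for page in $(seq 1 10); do
    curl -s "https://api.twitter.com/1/statuses/user_timeline.json?screen_name=ultranurd&count=200&page=$page" \
      -o tweets-page-$page.json
  done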

Conclusion

Cloud backups now have my full blessing as part of the Backup Tripod. Crashplan seems to be the best service for the cost, and fits in well with the other legs of the tripod. The steps I’ve suggested in these two posts are fairly easy (albeit Mac-centric); at this point I think very few people have an excuse for losing data due to not having backups. I can’t reiterate this enough: back up your data, or someday you’ll get burned. Help your less-skilled family members set up automated backups that they don’t have to worry about.

Increasingly, our lives are recorded almost entirely in bits. While this has many advantages, too numerous to go into here, bits are inherently more ephemeral. Regular backups are one of the best ways to avoid complete data loss, although they may need to be coupled with occasional conversions in order to avoid file format rot. Make sure you have a plan to protect all of your data, on your own devices and in the cloud.


Back That Thing Up

Introduction

Three Sundays ago, my primary Mac OS X hard drive failed. Those of you who follow me on Twitter got somewhat of a play-by-play as I discovered the depth of the failure: I got home to the Spinning Pinwheel of Death (SPOD), and quickly discovered that my computer would not wake from the screensaver or boot. However, I didn’t panic. Why? Because I have what I believe to be a relatively robust backup system for home use.

I can’t stress enough how important regular backups are. Data loss is one of my personal nightmares (well, that, and Lego or Andrle loss), since most of my life (professional and personal) is on the computer. Among other things, I’d lose every picture I’ve ever taken since freshman year of college, every homework assignment I’ve written on the computer since late 6th grade (when we got our first Mac), not to mention substantial configuration work and those precious saved games.

I sit atop what I call the Backup Tripod: regular clones to an external disk stored off-site, hourly incremental backups to a local disk or local network storage, and as-needed on-save synchronization to cloud storage. I’m sure there are many other articles out there that recommend a particular strategy, but this is my solution for Macs. I even convinced my parental units to use a similar setup. I’ll go into detail on what solutions I use and why (as well as recovery strategy) for each below the cut.

I can’t emphasize enough how important data backup is for the typical modern power user.


Möbius Bagel

Via JWZ’s LiveJournal I found a method for slicing a bagel into two linked halves. I decided to try it. Video below the cut.


Satisfy MacPorts Dependencies Locally

Introduction

Like many Unix geeks, I have software installed that I’ve built manually from source. A good example is my post on compiling Django; a number of the relevant dependencies were built in /usr/local/src/ and installed in /usr/local/. I also like using package managers, because if I’m not doing any customization (and the package is common and not hard-to-find), I want to just grab the latest version and slap it in the right place. The conflict between the two methodologies arises when a managed package depends on software that is already installed on your system, either as part of the default configuration (OS X ships with a fair bit of Unixy software, especially if you install the Dev Tools, although not always a “standard” or particularly recent version) or custom-built.

I recently dumped Fink for MacPorts; while I’ve used Fink for a long time (since an early version was available for Mac OS X 10.2 Jaguar, in fact), it’s just gotten into a messy state, maintenance-wise. I’ve been familiar with apt since using Debian-based systems at the SCCS, but the mish-mash of binary and source items, the preponderance of out-of-date packages, and the apparent need to install 70 metric boatloads of GNOME just to satisfy a few dependencies was frustrating. Of course, MacPorts has its own weaknesses, as do almost all package managers; in particular, none of them seem to track whether a package was installed explicitly by the user or merely to satisfy a dependency. My opinion is that the latter should be uninstalled when all of its dependents are uninstalled, but no package manager seems to agree with me on that. A rant on that probably merits a separate post.

Below the cut is a rough step-by-step guide to creating a local port index and writing Portfiles for your manual dependencies. Note that most MacPorts users would tell you this is a terrible idea, and that you should just install all the port dependencies, but I already put the effort into these custom from-source builds, and I just want to use them without duplicates getting dropped all over my hard drive.
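In outline, the approach looks something like this; the paths and port name are illustrative, and the full walkthrough is below the cut:

  # Create a local port tree containing a stub Portfile that declares
  # the manually built dependency as already satisfied
  sudo mkdir -p /usr/local/ports/devel/mydep
  sudo vi /usr/local/ports/devel/mydep/Portfile
  # Register the local tree by adding a line like this *above* the
  # rsync entry in /opt/local/etc/macports/sources.conf:
  #   file:///usr/local/ports
  # Then regenerate the local index so port can see the new Portfile
  cd /usr/local/ports && sudo portindex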


Nostromo Keybindings for WoW

Introduction

As you may have gathered, I have a… healthy… relationship with everyone’s favorite MMO, World of Warcraft. I forget who originally planted the idea in my head (there’s a good chance it was Lilboo, formerly of the Daring Blades on Kirin Tor), but I decided that I wanted a dedicated game controller that was more than just the keyboard; there’s just too much going on in WoW for an FPS-like control layout, in my opinion. I settled on a Nostromo, and after a few weeks of adjusting (and very few changes to my bindings since), I have gotten quite used to what may be an unusual control style.

Verbose explanation of how I use the device below the cut.


Amish Cinnamon Bread

Introduction

A coworker of mine gave me some live yeast bread starter two weekends ago, and I have proceeded to actually bake it into bread. This is, believe me, quite out of character. I am posting this recipe here (which, as far as I know, is relatively useless without the culture) both for my records and so I have a place to point people who need the recipe and may not want a printed copy, perhaps because they are, like me, allergic to paper.

If you live in Boston, I’m happy to provide you with some starter for free (I hear it’s like a cult); I’ll have 3-4 bags become available every week and a half or so.

I don’t know the origin of the recipe; Liz gave me a photocopied sheet with no authorship information. I’ll try to find out. I’ve made a few minor edits for my own clarification.

I’m also open to suggestions on a means of distribution other than plastic gallon ziploc bags; while they have the advantage of being airtight, and of clearly indicating when the bag needs to be squeezed, it seems like a waste of plastic. I am reusing the ones I’ve received, and the ones I’m keeping for my own permanent starter, but it seems like there might be a better way.

Caveats

  • Do NOT refrigerate the mixture (this will kill the yeast)
  • Air formation in the bag is normal (a byproduct of fermentation)
  • “Squeeze” means let the air out, and mix the starter a bit by squishing

Schedule

For each day, from the date marked on the starter, do the listed step:

  1. Nothing
  2. Squeeze
  3. Squeeze
  4. Squeeze
  5. Squeeze
  6. Add 1 cup flour, 1 cup sugar, and 1 cup milk. Squeeze.
  7. Squeeze
  8. Squeeze
  9. Squeeze
  10. Bake!

Baking

In a large bowl, combine the batter from the bag with 1 cup flour, 1 cup sugar, and 1 cup milk. Mix.

Partition 1 cup of starter into each of four 1-gallon ziploc bags. Pass them along to friends and family, along with a copy of these instructions or a link to this blog post: http://blog.ultranurd.net/2009/03/15/amish-cinnamon-bread/

Add to the remaining batter in the bowl:

  • 2 cups flour
  • 1 1/2 tsp baking powder
  • 1/2 tsp baking soda
  • 1/2 tsp salt
  • 1 tsp cinnamon
  • 2 small boxes instant vanilla pudding
  • 1 cup sugar
  • 1 cup oil
  • 1 tsp vanilla extract
  • 3 large eggs
  • 1/2 cup milk

Mix well.

In another bowl, mix 1 tsp cinnamon and 2 tsp sugar (or just use cinnamon-sugar if you have it). Sprinkle this into the bottom of two well-greased bread pans, then add the batter.

Bake the loaves at 325°F for 1 hour, or until a toothpick inserted into the center comes out clean.


Compiling Django with Twitter support as a Mac OS X Universal Binary

Introduction

This post is a guide for building your own version of Apache’s mod_python as a Universal Binary in order to support a custom Django install containing the Twitter libraries. As you can probably gather, this information is likely only useful to advanced Mac users who are comfortable in Terminal with compiling and installing software from source. If you’re still interested, gird your loins, crack your knuckles, grab some Mountain Dew, and read on.

Mac OS X 10.5 “Leopard” is yet another step forward into the world of 64-bit. At the same time, Apple has to support both PowerPC and Intel architectures. This is no mean feat, and it is where “fat” or Universal binaries come in. Apple also has an explanation of Universal binaries, although it’s heavy on PR. This is all well and good, but there is one problem: once you make this leap, all of your library dependencies must include the architecture you’re running as. Much software is still built as 32-bit only; while it may be a “fat” binary, containing both Intel and PowerPC machine code, it only has the 32-bit versions thereof. For reference, the names of the various architecture flags:

            32-bit    64-bit
  Intel     i386      x86_64
  PowerPC   ppc7400   ppc64

Huzzah naming conventions! There’s a lot of history in those names. I’ve linked to the relevant Wikipedia articles if you’re curious; these flags will be coming up again later when configuring various builds. The main thing to note is that most build configurations default to i386 on Intel Macs (even though Core 2 and Xeon processors are natively 64-bit), probably because most software is developed for 32-bit versions of Windows and Linux. As you’ll see, we’ll be overriding that default in several places to get this whole mess working.
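To give a taste of what that overriding looks like, here’s a sketch of coaxing a typical autoconf-based build into producing a four-way Universal binary; the real invocations for each component are below the cut:

  # Request all four architectures; dependency tracking has to be off
  # because gcc can't generate dependency info for multiple -arch flags
  ARCHES="-arch i386 -arch x86_64 -arch ppc7400 -arch ppc64"
  CFLAGS="$ARCHES" LDFLAGS="$ARCHES" ./configure --disable-dependency-tracking
  make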

Unfortunately, Universality is a cancer, which in my case starts with the Apple-shipped version of the Apache web server in 10.5, a universal binary. Everything it touches needs to be Universal as well, so that Apache can run as a 64-bit process by default. I wanted to add Django support on my web server via mod_python, specifically to play with the Twitter API, which meant I also needed to build python-twitter and its dependencies, as well as a MySQL python module to allow Django to talk to my database. None of these are included in the default Leopard version of Python 2.5.1.

After getting all of this set up, and trying to start my test Django app, mod_python was giving me errors about architecture. As it turns out, the included version of Python is only a “fat” 32-bit binary, not a Universal binary… which means all of the new Python modules I just compiled to support Twitter and Django were only 32-bit, which in turn means that the included Universal version of Apache and mod_python couldn’t use them. Yay.
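You can confirm exactly which architectures a binary or module contains with lipo or file, which is how this sort of mismatch reveals itself. On a stock Leopard install you’d expect something like:

  lipo -info /usr/bin/python
  # e.g. "Architectures in the fat file: /usr/bin/python are: i386 ppc"
  file /usr/sbin/httpd
  # e.g. "... Mach-O universal binary with 4 architectures"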

Below the cut you’ll find my complete instructions for compiling all of the relevant components and their dependencies. I also took the opportunity to update to the latest release version of Python 2.6 and MySQL 5.1, and as a side effect my database server is now running as a 64-bit process. Progress has been made here. Feel free to comment or contact me if you have questions.

