Backbonification: migrating a large JavaScript project from DOM spaghetti to Backbone.js

We’ve all done it. Our code base has one huge monolithic file, packed full of JavaScript spaghetti. It’s unwieldy, hard to debug, and has little to no separation of concerns. It is a nightmare to bring new engineers up to speed.

This blog post is about decomposing NewsBlur’s single-file, 8,500-line JavaScript application into its component parts: 8 models, 12 views, 3 routers, and 3 collections. This post explores patterns, techniques, and common pitfalls in migrating from vanilla JavaScript to Backbone.js. It covers moving routers, models, and views, and the process used to migrate a living app.

NewsBlur is a free RSS feed reader and is open-source. The benefit of being open-source is that you can see all of the changes I made in this migration by looking through the commit history.

As a bit of background, I worked on Backbone.js in its infancy, back when Jeremy Ashkenas and I were both working on DocumentCloud’s many open-source projects.

The Presentation

This post was written concurrently with a presentation. Depending on your style, you can either read on or flip through this deck. Both have the same content, but this post expands on every concept in the presentation.

There’s no need to go through the presentation. Just read on for the whole kaboodle.

Pre-reqs: Libraries

There are only two libraries you need to be intimately familiar with in order to make the most of your Backbone transition: Underscore.js and Backbone.js. That means not only being comfortable reading the source code of these two libraries, but also knowing all of the methods they expose, so you can reach into your grab-bag of tricks and pull out the appropriate function.

Underscore.js

Underscore.js is another DocumentCloud library that makes your code more readable and compact by providing useful functions that massage, filter, and jumble data.

One popular use of Underscore is creating short pipelines that take a large collection of models and filter it down based on conditions. That much is easy. But there are other uses that are beneficial to know.

You should be comfortable with all enumerable methods. Think about all of your model collections as reduce-able, filterable, and selectable.

Here are two examples of Underscore.js at work:

// Get ids of all active feeds
_.pluck(this.feeds.select(function(feed) {
    return feed.get('active');
}), 'id');
// Returns: [42, 36, 72, ...]

// Count fetched/unfetched feeds
var counts = this.feeds.reduce(function(counts, feed) {
    if (feed.get('active')) {
        if (!feed.get('not_yet_fetched') || feed.get('has_exception')) {
            counts['fetched_feeds'] += 1;
        } else {
            counts['unfetched_feeds'] += 1;
        }
    }
    return counts;
}, {
    'unfetched_feeds': 0,
    'fetched_feeds': 0
});
// Returns: {'unfetched_feeds': 3, 'fetched_feeds': 42}

Backbone.js

The star of the show is Backbone.js. The entire backbone.js file is fewer than 1,500 lines long, and that’s with 228/1478 lines of whitespace (15%) and 389/1478 lines of comments (26%).

This is a basic example of the layout of the four main classes: models, views, collections, and routers. A fifth meta-class called Events is mixed into each of these classes.
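
Here’s a minimal sketch of that layout, assuming illustrative names (Feed, FeedCollection, FeedTitleView, and AppRouter are stand-ins, not NewsBlur’s actual classes):

// Illustrative sketch only -- stand-in names, not NewsBlur's real classes.
var Feed = Backbone.Model.extend({
    defaults: {
        active: false
    }
});

var FeedCollection = Backbone.Collection.extend({
    model: Feed,
    url: '/feeds'
});

var FeedTitleView = Backbone.View.extend({
    tagName: 'li',
    events: {
        'click': 'open'
    },
    initialize: function() {
        // The Events mixin at work: models, collections, views, and
        // routers can all publish and subscribe.
        this.model.on('change', this.render, this);
    },
    render: function() {
        this.$el.html(this.model.get('feed_title'));
        return this;
    },
    open: function() {
        // Navigate to the feed, load its stories, etc.
    }
});

var AppRouter = Backbone.Router.extend({
    routes: {
        'site/:id': 'openFeed'
    },
    openFeed: function(id) {
        // Find the feed in the collection and tell the views to show it.
    }
});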

How to start

The first step is no easy task. Take your existing design and visually decompose it into its component views. Each view will be backed by either a single model or a combination of models. In fact, you can even have a view that isn’t backed by a model at all.

Take the NewsBlur UI for example. It’s a standard three-pane view, with feeds, stories, and story detail:

Notice that there are multiple views inside other views. Some views are meant to be simple wrappers around other, more functional views.
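
As a sketch of that idea (again with made-up names), an outer feed-list view might not be backed by any model at all, and instead simply wrap one small, functional view per feed:

// Illustrative only: a wrapper view with no model of its own, delegating
// to the per-model FeedTitleView from the sketch above.
var FeedListView = Backbone.View.extend({
    el: '.feed-list',

    initialize: function() {
        this.collection.on('reset add', this.render, this);
    },

    render: function() {
        this.$el.empty();
        this.collection.each(function(feed) {
            var item = new FeedTitleView({model: feed});
            this.$el.append(item.render().el);
        }, this);
        return this;
    }
});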

Continue reading Backbonification: migrating a large JavaScript project from DOM spaghetti to Backbone.js (6064 words)...

For Good Reason, NewsBlur Will Not Compete With the Big Boys

NewsBlur has been, and for as far out as I can see will be, a side-project. It’s fun, but I can’t even begin to imagine the headaches I’d face if I had to support a living salary through NewsBlur. My goal is just to meet potential co-founders and to try to make a splash. And I’ve been meeting folks here in NYC who I would not have met otherwise. So it’s working quite well, so far. Fabulously well, in fact.

Raising money for something like this must be difficult and a huge crapshoot. There are a lot of faddish readers out there who have already sucked up most of the goodwill that the press is willing to give to this area, and even then, they have a design element that is extremely hard to compete with. Financial backers would much rather shut a site down than let it live a meager existence. They’re not out to support a site; they want a 10x return. Funding doesn’t make any sense for what I want.

Besides, now I can build fun features like social, the iPhone app, and river of news in peace. Who knows what markets I would have to chase if I had stakeholders. Enterprise! Aggregators! Ick.

I think any other business is a better business to be in, but who knows if my tune will change within the next 12 months, as I roll out bigger features.

Old-style Mac OS X Leopard Exposé in Snow Leopard

Progress is progress, except when it gets in the way of your workflow. Let’s compare these two screenshots:

Old-style Leopard Exposé

New-style Snow Leopard Exposé

Notice how much more pleasant the old-style Exposé is? Introduced in Mac OS X 10.3 Panther, and virtually unchanged until OS X 10.6 Snow Leopard, it featured proportional windows. By just looking at the size of the window relative to the other windows, you can get a fair idea of what the application is.

The proportional windows went out the window with the new Exposé. Now it features an inexplicable grid, with windows resized to all different dimensions relative to their original size.

Old-style Exposé in Snow Leopard

The great news is that you can get the old-school Exposé back. The beta builds of Snow Leopard included a new Dock.app that used the old-style Exposé. By installing that beta Dock.app, you get the new Dock features of Snow Leopard while preserving the legendary Exposé.

Installation

  1. Download the Snow Leopard beta-build of Dock.app
  2. Save to your Desktop and unzip.
  3. Run the following commands in Terminal.app:

#!sh
sudo chown -R root ~/Desktop/Dock.app;
sudo chgrp -R wheel ~/Desktop/Dock.app;
sudo killall Dock && \
sudo mv /System/Library/CoreServices/Dock.app ~/Desktop/OldDock.app && \
sudo mv ~/Desktop/Dock.app /System/Library/CoreServices/

Easy to do and indispensable now that you have it back. Hat-tip to miknos at MacRumors for the original find.

Note that you will have to repeat this process every time you upgrade your Mac OS to a new patch release (10.6.6 -> 10.6.7).


What Happened to NewsBlur: A Hacker News Effect Post-Mortem

Last week I submitted my project, NewsBlur, a feed reader with intelligence, to Hacker News. This was a big deal for me. For the entire 16 months that I had been working on the project, I was waiting for it to be Hacker News-ready. It’s open-source on GitHub, so I also had the extra incentive to do it right.

And last week, after I had launched premium accounts and had just started polishing the classifiers, I felt it was time to show it off. I want to show you what the Hacker News effect has been on both my server and my project.

Hacker News As the Audience

When I wasn’t writing code on the subway every morning and evening, I would think about what the reaction on Hacker News would be. Would folks find NewsBlur too buggy? Would they be interested at all? Let me tell you, it’s a great motivator to have an audience in mind and to constantly channel them and ask their opinion. Is a big-ticket feature like Google Reader import necessary before it’s Hacker News ready? It would take time, and time was the only currency which I could pay with. In my mind, all I had to do was ask. (“Looks cool, but if there’s no easy way to migrate from Google Reader, this thing is dead in the water.”)

Kurt Vonnegut wrote: “Write to please just one person. If you open a window and make love to the world, so to speak, your story will get pneumonia.” (From Vonnegut’s Introduction to Bagombo Snuff Box.)

Let’s consider Hacker News as that “one person,” since, for all intents and purposes, it is a single place. I wasn’t working to please every Google Reader user: the die-hards, the once-in-a-seasons, or the Twitter-over-RSS’ers. For the initial version, I just wanted to please Hacker News. I know this crowd from seeing how they react to any new startup. What’s the unique spin, and what’s the good use of technology, they would ask. What could make it better, and is it good enough for now?

If you’re outsourcing the tech and just applying a shiny veneer, the Hacker News crowd sniffs it out faster than a beagle in a meat market. So I thought the best way to appeal to this crowd was to make decisions about the UI that would confuse a few people but enormously please many more. From the comments on the Hacker News thread, it looks like I didn’t wait too long.

Continue reading What Happened to NewsBlur: A Hacker News Effect Post-Mortem (1506 words)...

Migrating Django from MySQL to PostgreSQL the Easy Way

I recently moved NewsBlur from MySQL to PostgreSQL for a variety of reasons, but most of all because I wanted to use connection pooling and database replication with Slony, and Postgres has a great track record and community. But all of my data was stored in MySQL, and there was no super easy way to move from one database backend to another.

Luckily, since I was using the Django ORM, and with Django 1.2’s multi-db support, I could use Django’s serializers to move the data from MySQL’s format into JSON and then back into Postgres.

Unfortunately, if I were to use the command line, every single row of my models would have to be loaded into memory. Issuing a command like this:

python manage.py dumpdata --natural --indent=4 feeds > feeds.json

would take a long, long time, and it wouldn’t even work, since I don’t have anywhere close to enough memory to hold all of that.

Luckily, the dumpdata and loaddata management commands are actually just wrappers around Django’s internal serializers. I decided to iterate through my models and grab 500 rows at a time, serialize them, and then immediately deserialize them (so Django could move them from database to database without complaining).

import sys
from django.core import serializers

def migrate(model, size=500, start=0):
    # Copy rows for `model` from the 'mysql' database into 'default',
    # `size` rows at a time, to keep memory usage bounded.
    count = model.objects.using('mysql').count()
    print "%s objects in model %s" % (count, model)
    for i in range(start, count, size):
        print i,
        sys.stdout.flush()
        original_data = model.objects.using('mysql').all()[i:i+size]
        original_data_json = serializers.serialize("json", original_data)
        new_data = serializers.deserialize("json", original_data_json, 
                                           using='default')
        for n in new_data:
            n.save(using='default')

migrate(Feed)
Continue reading Migrating Django from MySQL to PostgreSQL the Easy Way (1041 words)...