The New Flowdock was built from the ground up on top of Backbone.js. Since Flowdock is all about real-time messaging, our web app posts and receives messages via a Socket.io backend. To support saving messages through Socket.io instead of a REST API, we wrote a custom Backbone.sync method.

We also use Bacon.js a lot to manage message streams in the client. If you’re not familiar with Bacon.js, take a look at the primer we posted about Functional Reactive Programming and Bacon.js. The Bacon.js EventStreams are an important part of keeping our data models and collections up to date.

Introduction

About the terminology: Flowdock is a real-time communication tool, which means that the core of our service is delivering different kinds of messages. When I talk about messages in this blog post, you can think of them as chat messages that have a number of properties: content, the user who sent it, the flow (chat room) it was posted to and tags, among others. Messages can also be emails, notifications about git commits or anything else we support. The exact format is not important; what matters is that messages represent most of the content that is eventually displayed to the user, and that they can be filtered by a number of factors, like the flow they were sent to or the tags they carry.

What’s this Backbone.sync method then? Backbone delegates every save or read operation to its own sync method. By default, it maps the different verbs (create, update, etc.) to CRUD REST API operations à la Rails. Backbone.sync takes three arguments:

  • method, which is any of ‘update’, ‘patch’, ‘create’, ‘read’ and ‘delete’
  • model, which is the model or collection to operate on
  • options, which can include success and error callbacks, among other things.

The default implementation serializes the model or collection (if needed) and then makes an AJAX request to the URL specified by the model’s url property. The model or collection gets updated after the operation completes.
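For reference, the default sync maps those verbs to HTTP methods roughly like this (paraphrased and simplified from Backbone’s internals):

# How Backbone.sync maps verbs to HTTP methods (simplified)
methodMap =
  'create': 'POST'
  'update': 'PUT'
  'patch':  'PATCH'
  'delete': 'DELETE'
  'read':   'GET'

# The default implementation then issues roughly:
# Backbone.ajax type: methodMap[method], url: _.result(model, 'url'), ...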

Creating Messages Over Socket.io

So, our custom sync needs to post new messages to the backend through Socket.io and otherwise work normally.

Flowdock.socketIoSync = (method, model, options) ->

  # Create-operations get routed to Socket.io
  if method == 'create'
    Flowdock.connection.messages.push model.toJSON()

  # All other operations use standard Backbone.sync
  else
    Backbone.sync method, model, options

Flowdock.connection.messages is a Bacon.js Bus that forwards messages without an ID through Socket.io to our backend. The relevant parts of the connection class work as follows:

class Flowdock.Connection

  constructor: ->
    @messages = new Bacon.Bus()
    @socket = @connect() # Returns a Socket.io socket
    @messages.filter((message) -> !message.id).onValue (message) =>
      @socket.emit "message", message
  ...

We’ve intentionally broken Backbone.sync’s default behaviour of returning a promise and executing success and error callbacks, since our implementation handles that state as part of the message stream. Basically, we keep track of the UUIDs of sent messages and check whether they come back with an ID (aka the “success callback”); if they don’t, we send an error message to the messages bus (aka the “error callback”).
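A minimal sketch of that bookkeeping could look like the following. The trackDelivery method, the 10-second timeout and the 'message-error' event name are all hypothetical, not our exact implementation:

  # Hypothetical sketch: called for each message sent through the socket
  trackDelivery: (message) ->
    # "Success callback": the server echoes the message back with our UUID
    # and a server-generated ID
    acked = @messages.filter (m) -> m.uuid == message.uuid && m.id?

    # "Error callback": nothing comes back within the timeout
    failed = Bacon.later 10000, { event: 'message-error', uuid: message.uuid }

    # Whichever happens first wins; on failure, push an error message
    # to the same bus the collections are listening to
    acked.merge(failed).take(1).onValue (result) =>
      @messages.push(result) if result.event == 'message-error'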

One important thing to note is that the message bus is the same one that receives new messages from Socket.io and that all of our collections listen to. When new messages get pushed to the bus, all interested collections, and therefore the UI, can react immediately to the changes, even before the messages complete a roundtrip to our backend.
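The Socket.io side of that wiring is simple: everything the server pushes to us ends up on the same bus. Inside the connection class it boils down to something like this (a sketch, not our exact code):

  # Inside the Connection constructor: messages from the server land on the
  # same bus as local, not-yet-saved messages
  @socket.on "message", (message) =>
    @messages.push message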

Only our message models require this kind of sync method, and Backbone enables us to choose the sync method to use per model:

class Models.Message extends Backbone.Model

  sync: Flowdock.socketIoSync
 
  ...

This allows us to simply create a new message, save it, and it’ll automatically go through Socket.io.

# Create a new message
message = new Models.Message(content: 'foo', flow: 'example:main', user: 1)
message.save()
# -> ['message', {
#   event: 'message',
#   content: 'foo',
#   user: 1,
#   flow: 'example:main',
#   uuid: 'abbaacdcABBAACDC'
# }] is emitted to socket

Updating Collections and Models

The second part of synchronizing messages over Socket.io is receiving them: new messages should end up in the correct collections and existing messages should get updated. You might already have wondered how the messages posted through the websocket get IDs and other server-generated properties. In short, we generate a UUID for each new message in the client before saving it; when the updated message arrives back from the server through Socket.io, the message collection uses the UUID to find and update the correct model.
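As a sketch, the UUID stamping could live in the Message model’s initialize. The generateUuid helper here is a placeholder for whatever UUID library is in use:

  # In Models.Message (sketch; generateUuid is a placeholder for a UUID helper)
  initialize: ->
    # Stamp unsaved messages with a client-generated UUID so the server's
    # echo (carrying both the uuid and the new id) can be matched back to
    # this model
    @set(uuid: generateUuid()) unless @has('uuid')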

Our collections (and some models) consume message streams and filter messages relevant to them. Usually the consumed stream is a filtered version of the global message bus. This way we can push all of the messages (both new and old) to the single message bus. We don’t need to know who’s listening to it and the message collections always stay up to date.

This is how consuming a message stream works:

class Collections.Messages extends Backbone.Collection

  ...
 
  consume: (@stream) ->
    @stream.filter(@messageFilter).onValue (message) =>
      # Find message by UUID and update it if found
      existingMessage = @findByUuid(message.uuid)
      if message.id && existingMessage? && !existingMessage.id
        existingMessage.set(message)
        existingMessage.trigger("sync")

      # Add the message if it does not already exist
      else if !message.id || !@get(message.id)
        @add(message)

      return Bacon.more

  ...

This logic is used as new messages and updated data come in, but we still use the collection.fetch method to load the initial data and history into collections from our REST API.
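To tie it together, a flow-specific collection might define its messageFilter against the flow it represents and consume the global bus. The FlowMessages subclass below is illustrative, not our actual class:

# Illustrative flow-specific collection
class Collections.FlowMessages extends Collections.Messages

  initialize: (models, options) ->
    super
    @flow = options.flow

  # Only messages posted to this flow are interesting
  messageFilter: (message) =>
    message.flow == @flow

# Wire the collection to the global message bus
messages = new Collections.FlowMessages([], flow: 'example:main')
messages.consume Flowdock.connection.messages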

Conclusions

Backbone.js is not specifically built for real-time applications, so using it in one requires some modification. Backbone is, however, a really solid foundation to build on, and since it has strict conventions, it’s really easy to customize the behaviour.

Using Bacon.js as the interface to the websocket library decouples Backbone collections and models from the Socket.io part, which means we could swap Socket.io for something else (like Faye, SockJS or EventSource) with minimal effort if needed. None of the collections or models actually know anything about Socket.io, so wrapping another library in a similar Bacon.js interface is easy. The consume pattern also makes unit testing really easy, since we don’t need to mock the Socket.io connection.
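For example, a unit test can drive a collection with a plain Bacon.Bus standing in for the connection; something along these lines (assertion style depends on your test framework):

# Sketch of a test setup: no Socket.io, just a Bus standing in for the connection
fakeStream = new Bacon.Bus()
collection = new Collections.Messages()
collection.messageFilter = -> true   # accept everything in this test
collection.consume fakeStream

fakeStream.push(id: 1, content: 'hello', flow: 'example:main', user: 1)
# collection.length is now 1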