Using Make with React and RequireJS

RequireJS is a great library for building modular JavaScript clients that conform to the AMD API specification. It gives your JS an import-like mechanism that helps you avoid global-namespace issues and makes your code feel more like code written in a server-side language such as Python or Java. It also includes an optimization tool that can concatenate your JavaScript into a single file and minify it, which helps reduce HTTP overhead.

I recently wrote about React, which is a library targeted at constructing UIs. React uses a special syntax called JSX for specifying DOM components in an XML-like notation, which compiles down to vanilla JavaScript. JSX can either be precompiled or compiled in the browser at runtime. The latter option has obvious performance implications and probably shouldn’t be used in production, but it works great for quickly hacking something together.

However, if you’re using RequireJS and opt to defer JSX compilation to the browser, you’ll have problems loading your JSX modules since they aren’t valid JavaScript. Fortunately, there are RequireJS plugins to work around this, such as require-jsx, which lets you simply do the following:
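(The paths below are illustrative; they depend on where Bower put React and the plugin and on the plugin’s own dependencies, such as the JSXTransformer.)

```javascript
// main.js -- a sketch of loading a JSX module through require-jsx.
require.config({
    paths: {
        react: 'bower_components/react/react',
        jsx: 'bower_components/require-jsx/jsx'
    }
});

// The jsx! prefix routes the module through the plugin, which compiles the
// JSX in the browser as the module is loaded.
require(['react', 'jsx!components/app'], function(React, App) {
    React.renderComponent(App(), document.getElementById('content'));
});
```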

The require-jsx plugin just performs the compilation when a module is loaded.

The other option, as I hinted at, is to precompile your JSX. This moves the JSX transformation to build time and allows Require’s optimizer to minify your entire client. React has a set of complementary tools, aptly named react-tools, which includes a command-line utility for performing this compilation.

The jsx tool can also watch directories, doing the compilation whenever the source changes, with the `--watch` option.
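In practice that looks something like this (directory names are just examples):

```sh
# install react-tools, which provides the jsx binary
$ npm install -g react-tools

# one-off compilation of everything in src/ into build/
$ jsx src/ build/

# or recompile automatically whenever a source file changes
$ jsx --watch src/ build/
```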

Require now has no problem loading our React components since they are plain ole JavaScript—no special JSX plugins necessary. This also means we can easily hook in minification using Require’s r.js:
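A build profile along these lines is enough to get concatenation and minification (file and module names here are illustrative):

```javascript
// build.js -- r.js build profile
({
    baseUrl: 'build',                 // directory containing the compiled (post-JSX) modules
    mainConfigFile: 'build/main.js',  // reuse the runtime paths/shim config
    name: 'main',                     // entry-point module to trace dependencies from
    out: 'dist/main.min.js',          // single concatenated, minified output
    optimize: 'uglify2'
})
```

Running `node r.js -o build.js` produces the minified bundle.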

You can use whatever build tool you want to tie all these things together. I personally prefer Make because it’s simple and ubiquitous.
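Here’s a rough sketch of the relevant target (paths and tool locations are assumptions; adjust them for your project layout):

```makefile
js:
	bower install              # fetch client-side dependencies
	jsx src/ build/            # precompile JSX to plain JavaScript
	node r.js -o build.js      # concatenate and minify with the r.js profile

.PHONY: js
```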

Running `make js` will install my Bower dependencies, perform JSX compilation, and then minify the client. This workflow works well and makes it easy to set up different build steps, such as pip-installing Python requirements, running integration and unit tests, and performing deploys.

Building User Interfaces with React

If you follow me on Twitter, you’ve probably heard me raving about React. React is described as a “JavaScript library for building user interfaces” and was open sourced by Facebook about a year ago. Everybody and their mom has a JavaScript framework, so what makes React so interesting? Why would you use it over mainstays like Backbone or Angular?

There are a few things that make React worth looking at. First, React is a library, not a framework. It makes no assumptions about your frontend stack, and it plays nicely with existing codebases, regardless of the tech you’re using. This is great because you can use React incrementally for new or legacy code. Write your whole UI with it or use it for a single feature. All you need is a DOM node to mount.

React is delightfully simple (contrasted with Angular, which is a nightmare for beginners, and Backbone, which is relatively simple but still has several core concepts). It’s built around one idea: the Component, which is merely a reusable unit of UI. React was designed from the ground up to be composable: Components are composed of other Components. Everything in the DOM is a Component, so your UI consists of a hierarchy of them.

Components can be built using JSX, an XML-like syntax that compiles down to regular JavaScript. Alternatively, they can be specified using plain-old JavaScript directly. The result is the same, but JSX makes it easy to visualize your DOM.
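To make that concrete, here’s a contrived Component written both ways, using the React.createClass API of the time:

```javascript
/** @jsx React.DOM */
// With JSX:
var Hello = React.createClass({
    render: function() {
        return <div className="hello">Hello, {this.props.name}</div>;
    }
});

// What the JSX compiles down to in plain JavaScript:
var Hello = React.createClass({
    render: function() {
        return React.DOM.div({className: 'hello'}, 'Hello, ', this.props.name);
    }
});
```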

React does not do two-way data binding (though it’s worth noting that React provides a small add-on for getting the conciseness of two-way binding with the correctness of its one-way binding model). This is by design. React uses a one-way data-flow model, which means data flows in only one direction, from parent to child. Two-way data binding makes it difficult to reason about your code. The advantage of the one-way model that React adopts is that it essentially turns your UI into a deterministic state machine. On the surface, it behaves as if the entire UI is simply re-rendered based on the current state of your data model. If you know what state your data is in, you know exactly what state your UI is in. Your UI is predictable. The React mantra is “re-render, don’t mutate.”

Re-rendering the entire DOM sounds expensive, but this is where React really shines. In order to draw your UI, it maintains a virtual DOM which can then be diffed. React’s diffing algorithm determines the minimum set of Components that need to be updated. It also batches reads and writes to the real DOM. This makes React fast.

Data is modeled in two ways in a React Component: props and state. This distinction highlights the one-way data flow described earlier. Props consist of data that is passed from parent to child; a Component’s props can only be set by its parent. State, on the other hand, is an internal data structure that is accessed and modified only from within a Component. A Component is re-rendered when either its props or state changes.
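Here’s a contrived Counter Component to illustrate: `label` arrives as a prop from the parent, `count` lives in state, and each click triggers a re-render.

```javascript
/** @jsx React.DOM */
var Counter = React.createClass({
    getInitialState: function() {
        return {count: 0};  // internal state, owned by this Component
    },
    handleClick: function() {
        // setState triggers a re-render with the new count
        this.setState({count: this.state.count + 1});
    },
    render: function() {
        return (
            <button onClick={this.handleClick}>
                {this.props.label}: {this.state.count}
            </button>
        );
    }
});

// The parent sets props; the Component manages its own state.
React.renderComponent(<Counter label="Clicks" />, document.getElementById('content'));
```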

Once again, this makes it really easy to reason about your code (and unit test). Also note the use of the onClick handler. React provides a synthetic event system that gives you cross-browser compatible event listeners that you can attach to Components.

React and Backbone’s Router make a surprisingly powerful, yet effortless, combination for building single-page applications.
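The idea, sketched below with hypothetical page Components, is that the Router owns the URLs while each route simply mounts a Component:

```javascript
/** @jsx React.DOM */
var AppRouter = Backbone.Router.extend({
    routes: {
        '':        'home',
        'reports': 'reports'
    },
    home: function() {
        React.renderComponent(<HomePage />, document.getElementById('content'));
    },
    reports: function() {
        React.renderComponent(<ReportsPage />, document.getElementById('content'));
    }
});

new AppRouter();
Backbone.history.start();
```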

React makes it trivial to build small web apps, but because of its emphasis on reusability and clean data modeling, it also scales well for large, complex UIs. You don’t have to wait for a new project to try it; just start replacing small pieces of your UI with React Components. This makes it a lot easier for developers to adopt.

Real-Time Client Notifications Using Redis and Socket.IO

Backbone.js is great for building structured client-side applications. Its declarative event-handling makes it easy to listen for actions in the UI and keep your data model in sync, but what about changes that occur to your data model on the server? Coordinating user interfaces for data consistency isn’t a trivial problem. Take a simple example: users A and B are viewing the same data at the same time, while user A makes a change to that data. How do we propagate those changes to user B? Now, how do we do it at scale, say, several thousand concurrent users? What about external consumers of that data?

One of our products at WebFilings called for real-time notifications for a few reasons:

  1. We needed to keep users’ view of data consistent.
  2. We needed a mechanism that would alert users to changes in the web client (and allow them to subscribe/unsubscribe to certain events).
  3. We needed notifications to be easily consumable (beyond the scope of a web client, e.g. email alerts, monitoring services, etc.).

I worked on developing a pattern that would address each of these concerns while fitting within our platform’s ecosystem, giving us a path of least resistance with maximum payoff.

Polling sucks. Long-polling isn’t much better. Server-Sent Events are an improvement. They provide a less rich API than the WebSocket protocol, which supports bi-directional communication, but they do have some niceties like handling reconnects and operating over traditional HTTP. Socket.IO provides a nice wrapper around WebSockets while falling back to other transport methods when necessary. It has a rich API with features like namespaces, multiplexing, and reconnects, but it’s built on Node.js, which means it doesn’t plug into our Python stack very easily.

The solution I decided on was a library called gevent-socketio, which is a Python implementation of the Socket.IO protocol built on gevent, making it incredibly simple to hook into our existing Flask app.

The gevent-socketio solution really only solves a small part of the overarching problem by providing a way to broadcast messages to clients. We still need a way to hook these messages into our Backbone application and, more importantly, a way to publish and subscribe to events across threads and processes. The Socket.IO dispatcher is just one of potentially many consumers, after all.

The other piece of the solution is to use Redis for its excellent pubsub capabilities. Redis allows us to publish and subscribe to messages from anywhere, even from different machines. Events that occur as a result of user actions, task queues, or cron jobs can all be captured and published to any interested parties as they happen. We’re already using Redis as a caching layer, so we get this for free. The overall architecture looks something like this:

[Figure: pubsub architecture diagram]

Let’s dive into the code.

Hooking gevent-socketio into our Flask app is pretty straightforward. We essentially just wrap it with a SocketIOServer.
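A minimal sketch, assuming a standard Flask app object (module names are illustrative):

```python
# run_server.py -- serve the Flask app through gevent-socketio
from gevent import monkey
monkey.patch_all()

from socketio.server import SocketIOServer

from myapp import app  # your existing Flask application

if __name__ == '__main__':
    SocketIOServer(('0.0.0.0', 5000), app, resource='socket.io').serve_forever()
```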

The other piece is registering client subscribers for notifications:
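One way to do that is a catch-all Socket.IO endpoint that hands each connection to gevent-socketio along with the namespace it should manage (the route and namespace names are illustrative; NotificationsNamespace is sketched further down):

```python
from flask import request
from socketio import socketio_manage

@app.route('/socket.io/<path:remaining>')
def socketio_endpoint(remaining):
    # Hand the raw WSGI environ off to gevent-socketio, mapping the
    # /notifications namespace to our handler class.
    socketio_manage(request.environ, {'/notifications': NotificationsNamespace},
                    request)
    return 'ok'
```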

NotificationsNamespace is a Socket.IO namespace we will use to broadcast notification messages. We use gevent-socketio’s BroadcastMixin to multicast messages to clients.

When a connection is received, we spawn a greenlet that listens for messages and broadcasts them to clients in the notifications namespace. We can then build a minimal API that can be used across our application to publish notifications.
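Putting those pieces together might look roughly like this (channel, event, and helper names are assumptions):

```python
import json

import redis
from socketio.namespace import BaseNamespace
from socketio.mixins import BroadcastMixin

NOTIFICATIONS_CHANNEL = 'notifications'
redis_client = redis.StrictRedis()


class NotificationsNamespace(BaseNamespace, BroadcastMixin):

    def listener(self):
        # Subscribe to the Redis channel and broadcast each published
        # message to the clients in this namespace.
        pubsub = redis_client.pubsub()
        pubsub.subscribe(NOTIFICATIONS_CHANNEL)
        for message in pubsub.listen():
            if message['type'] == 'message':
                self.broadcast_event('notification', json.loads(message['data']))

    def recv_connect(self):
        # Called when a client connects; spawn the listener greenlet.
        self.spawn(self.listener)


def publish_notification(event, data):
    """Minimal API for publishing a notification from anywhere in the app."""
    redis_client.publish(NOTIFICATIONS_CHANNEL,
                         json.dumps({'event': event, 'data': data}))
```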

Wiring notifications up to the UI is equally simple. To facilitate communication between our Backbone components while keeping them decoupled, we use an event-dispatcher pattern relying on Backbone.Events. The pattern looks something like this:
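A sketch of that dispatcher (event and view names are illustrative):

```javascript
// dispatcher.js -- a single object extended with Backbone.Events that the
// rest of the app publishes to and subscribes on.
var dispatcher = _.extend({}, Backbone.Events);

// A view subscribes to the events it cares about...
var ReportView = Backbone.View.extend({
    initialize: function() {
        dispatcher.on('report:changed', this.render, this);
    },
    render: function() {
        // re-render with the latest data
        return this;
    }
});

// ...while anything else can publish without knowing who is listening.
dispatcher.trigger('report:changed', {id: 42});
```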

This pattern makes it trivial for us to allow our views, collections, and models to subscribe to our Socket.IO notifications because we just have to pump the messages into the dispatcher pipeline.
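That bridge can be as small as a single handler (the namespace and event names match the server-side sketch above and are assumptions):

```javascript
// Connect to the notifications namespace and re-emit every server-side
// notification through the client-side dispatcher.
var socket = io.connect('/notifications');

socket.on('notification', function(message) {
    dispatcher.trigger('notification:' + message.event, message.data);
});
```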

Now our UI components can subscribe and react to client- and server-side events as they see fit and in a completely decoupled fashion. This makes it very easy for us to ensure our client-side views and models are updated automatically while also letting other services consume these events.