This entry is a sort of “part 2” to my previous entry React Demystified. Now that I understand React better, I want to take a closer look at the blog post that motivated this quest to begin with, The Future of JavaScript MVC Frameworks. That entry advocates a vision for what the MVCs of the future will look like in JavaScript, and is accompanied by some strong benchmark numbers.

The article’s overall argument is that the design/architecture of most JavaScript MVC libraries today makes them slow. While they can be optimized to improve performance, a much better approach (according to the article) is to change the design of the MVC library to something that is inherently faster: a design where the optimizations fall out “for free”, a design that is fast by default. This is the claim that I want to take a deeper look at and evaluate in detail.

I want to go deeper because the form of the argument – the idea that a fundamentally different design can render a lot of busy-work obsolete – is one that resonates with me. But several distinct aspects are mashed together in the article, so I want to break them apart and look at them one by one.

React/Om vs. Backbone.js benchmarks

The article begins with some benchmarks from TodoMVC, a very cool web site that implements the same little TODO application in a ton of different JavaScript MVC frameworks. It is not only a great body of example code, but it also includes benchmarks for comparing performance across the frameworks.

The article starts by noting that the React/Om library (which the author also wrote) is 2-4x faster than Backbone in “Benchmark 1” of TodoMVC, and about 800x faster than Backbone in “Benchmark 2”. This is followed by a profiling graph showing Backbone making a ton of short-running function calls, while React/Om makes far fewer calls that each run longer.

From what I can tell, these benchmark numbers mainly illustrate the difference between doing direct DOM updates (as the Backbone example does) and using a library like React that batches DOM updates. I suspect that these benchmarks have little to do with Om and everything to do with React.

In particular, “Benchmark 2” (the one with the 800x speedup vs. Backbone) is effectively a no-op on the DOM, so none of Om’s custom shouldComponentUpdate() optimizations come into play here. The only performance contribution Om appears to make in this benchmark is that it uses requestAnimationFrame() instead of React’s default batching strategy, but this can be done easily with plain React too: here is a github project that does it. Unfortunately the React example for TodoMVC doesn’t implement the benchmarks, but if it did I suspect the performance would be almost identical to the React/Om numbers.
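To make the batching idea concrete, here is a rough sketch of what requestAnimationFrame()-based batching looks like in general. This is my own illustration rather than code from React, Om, or the project linked above, and the names (notifyChange, the '#app' element) are hypothetical:

```javascript
// A framework-agnostic sketch of requestAnimationFrame()-based batching.
// However many times notifyChange() fires between frames, the DOM is written
// to at most once per frame. The '#app' element is a hypothetical render target.
var scheduled = false;
var latestState = null;

function notifyChange(newState) {
  latestState = newState;        // remember only the most recent state
  if (!scheduled) {
    scheduled = true;
    requestAnimationFrame(function () {
      scheduled = false;
      render(latestState);       // a single DOM update for the whole batch
    });
  }
}

function render(state) {
  document.getElementById('app').textContent = JSON.stringify(state);
}
```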

The author addresses this point a moment later:

Of course you can use Backbone.js or your favorite JS MVC with React, and that’s a great combo that delivers a lot of value. However, I’ll go out on a limb and say I simply don’t believe in event-oriented MVC systems - the flame graph above says it all. Decoupling the models from the views is only the first important step.

The argument here is that even if you use React with Backbone, Backbone is still based around change events (i.e. a “push model”). Using Backbone together with React means calling React’s forceUpdate() whenever a Backbone change handler fires. The author argues that the flame graph from before illustrates that making lots of little function calls whenever there is a change event is slow.
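Concretely, the pairing looks something like the sketch below. This is my own illustration in the React.createClass style of the era, not code from the article; TodoItem and its model prop are hypothetical:

```javascript
// A minimal sketch of the Backbone-plus-React pairing described above.
// TodoItem is a hypothetical component; its `model` prop is assumed to be a
// Backbone model.
var TodoItem = React.createClass({
  componentDidMount: function () {
    // Re-render whenever the Backbone model fires a change event.
    this.props.model.on('change', this.handleChange, this);
  },
  componentWillUnmount: function () {
    this.props.model.off('change', this.handleChange, this);
  },
  handleChange: function () {
    this.forceUpdate();          // no direct DOM work here; React diffs and updates
  },
  render: function () {
    return React.DOM.li(null, this.props.model.get('title'));
  }
});
```

The change handler does no DOM work itself; it just asks React to re-render, and whether that is cheap enough is exactly the question the next paragraph takes up.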

I’m not sure I buy this argument. The flame graph from before is significant because it shows lots of DOM updates. It illustrates that Backbone performs all of its DOM updates eagerly, and DOM updates are known to be one of the biggest bottlenecks in JavaScript applications. React is fast because it aggressively batches and minimizes its DOM updates. Having Backbone fire a bunch of change handlers that call React’s forceUpdate() a lot isn’t slow, because React still defers the actual re-rendering and batches the resulting DOM updates.

Om and Immutable Data Structures

The next section describes a lot of points that are more Om-specific. Om is a ClojureScript interface to React built around immutable data structures: a structure is never mutated once created, which means that the contents of the entire tree are captured in the identity of the object at the root. Or in simpler terms: if you are passed an object that is the same object (by identity) as one you saw earlier, you can be assured that it hasn’t changed in the meantime. This pattern has a lot of nice properties, but it also generates more garbage than mutating objects in place.
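As an illustration of that identity property, here is a small sketch in plain JavaScript. Om actually gets this from ClojureScript's persistent data structures; plain objects and arrays, treated as immutable by convention, stand in for them here:

```javascript
// A sketch of the identity property using plain objects and arrays treated as
// immutable by convention.
var appState = {
  todos: [{ title: 'write post', done: false }],
  filter: 'all'
};

// "Updating" never mutates: it builds a new root (and new objects along the
// changed path) while sharing every untouched subtree.
var nextState = {
  todos: appState.todos.concat([{ title: 'edit post', done: false }]),
  filter: appState.filter
};

console.log(nextState === appState);                    // false: something changed
console.log(nextState.todos === appState.todos);        // false: this path changed
console.log(nextState.todos[0] === appState.todos[0]);  // true: shared, hence unchanged
```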

React/Om takes advantage of this by implementing React’s shouldComponentUpdate() hook to optimize away render() calls (and the associated diffing) when the objects are identical. When the objects are identical they are known to have the same value, which is how we know the optimization is safe. This is particularly important for React/Om because, unlike React with mutability (i.e. setState()), React/Om completely refreshes the tree from the root every time. So React/Om’s shouldComponentUpdate() will return true for the root and for any path to a node that has changed, but can prune away diffs for any subtree that has not changed.
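In plain React terms, the optimization amounts to something like the following sketch. This is my approximation, not Om's actual implementation (which compares its own ClojureScript values and cursors):

```javascript
// An approximation of the reference-equality pruning described above. This
// version just compares the immutable props handed down from the root.
var TodoList = React.createClass({
  shouldComponentUpdate: function (nextProps) {
    // Identity implies equality for immutable data, so a strict identity check
    // is enough to skip render() and the virtual-DOM diff for this whole subtree.
    return nextProps.todos !== this.props.todos;
  },
  render: function () {
    return React.DOM.ul(null, this.props.todos.map(function (todo) {
      return React.DOM.li({ key: todo.title }, todo.title);
    }));
  }
});
```

When the check returns false, neither this component nor anything beneath it re-renders, which is what keeps the refresh-from-the-root approach cheap.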

I think it’s worthwhile to note that React/Om does not necessarily do less work than if you used React with setState(). If you called setState() on an interior node of the virtual DOM, you would only trigger a re-render of that one component. But with React/Om, you will trigger a re-render of all components between the root and the changed component. The final DOM update will be the same in both cases, but React will have had to do more work to discover that. It’s only a small amount more work (and in other cases the setState() approach will require more work, possibly a lot more if you’re careless), but it was interesting and unexpected to me that the immutable approach isn’t a strict improvement over the mutable one.

The article goes on to describe several other benefits of functional-style immutable data structures. The biggest one, as I see it, is “undo for free”: an undeniably powerful property of immutable data structures.
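For example, here is a rough sketch of what “undo for free” can look like (hypothetical names): because every application state is an immutable value, saving a reference to an old root is all it takes to snapshot it.

```javascript
// A sketch of "undo for free". Saving a reference to an old root is enough,
// since nothing can mutate it afterwards.
var history = [];

function transact(appState, update) {
  history.push(appState);        // snapshot: just keep the old root around
  return update(appState);       // `update` returns a brand-new state value
}

function undo(currentState) {
  return history.length > 0 ? history.pop() : currentState;
}

// Usage: appState = transact(appState, function (s) { return { count: s.count + 1 }; });
//        appState = undo(appState);  // back to the previous snapshot
```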

Conclusions

The article’s broad claim that current JavaScript MVCs are inherently slow made a big impression on me. But when I examined the article’s substance in more depth, I was not fully convinced. To me, most of the claimed improvements boil down to this: updating the DOM is slow, so use a framework that batches DOM updates. React does this, and I have heard that Angular does some batching too. After looking into it, I’m not convinced that immutable data structures inherently perform better in the browser than mutable ones. And it is hard to imagine convenient two-way data binding being built on top of immutable data structures.

That said, there is a lot to like about immutable data structures, particularly their snapshot/undo capabilities.