Reflections on software performance

This feels relevant to reinventions that attempt to make the web faster (like AMP), as opposed to building in a leaner, more purposeful way with the tools we already have, which have been optimized for performance over many years.

There’s a general observation here: attempts to add performance to a slow system often add complexity, in the form of complex caching, distributed systems, or additional bookkeeping for fine-grained incremental recomputation. These features introduce new classes of bugs, and they also add overhead that makes straight-line performance even worse, further compounding the problem.
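A hypothetical sketch of the caching case: wrapping a slow store in a cache speeds up reads, but every write path must now remember to invalidate, which is exactly the new class of bugs the observation describes. (The class names here are illustrative, not from any real system.)

```python
class SlowStore:
    """Stands in for a slow backing store (imagine a disk or network read)."""
    def __init__(self):
        self._data = {}

    def get(self, key):
        return self._data.get(key)

    def put(self, key, value):
        self._data[key] = value


class CachedStore:
    """Faster reads -- at the cost of invalidation bookkeeping on every write."""
    def __init__(self, backing):
        self._backing = backing
        self._cache = {}

    def get(self, key):
        if key not in self._cache:
            self._cache[key] = self._backing.get(key)
        return self._cache[key]

    def put(self, key, value):
        self._backing.put(key, value)
        # Omit this line and reads silently return stale data:
        # a bug class that simply cannot exist in the uncached version.
        self._cache.pop(key, None)


store = CachedStore(SlowStore())
store.put("a", 1)
print(store.get("a"))  # cached after first read
store.put("a", 2)      # correctness now depends on invalidation
print(store.get("a"))
```

If the backing store were fast enough in the first place, none of this machinery would be needed.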

When a tool is fast in the first place, these additional layers may be unnecessary to achieve acceptable overall performance, resulting in a system that is, on net, much simpler for a given level of performance.

That describes Google’s AMP almost perfectly, does it not? It was an attempt to make the web faster, not by encouraging proper use of the web technologies already available, but by reinventing the wheel, even to the point of changing URLs.