One thing you notice when studying the history of tech and the internet is a winner-takes-all trend that slowly turns tech into a higher and higher tower of complexity.
Here, I call tech calcification the observed principle that once a technology is sufficiently widespread, new capabilities are built on top of it instead of next to it. Here are a few examples; a short sketch after the list makes the stacking concrete:
- Everything is built on top of IPv4.
- Everything is built on top of TCP.
- Everything is built on top of HTTP.
- Everything is built on top of a client-server architecture.
- Everything is built on top of JSON.
- Everything is built on top of JavaScript.
- Everything is built on top of Linux.
- Everything is built on top of NATed networks.
- Everything is built on top of MCP…
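To see how these layers pile up in practice, here is a minimal sketch of one ordinary API call touching most of the items on the list above (the endpoint is hypothetical, purely for illustration):

```python
# One ordinary API call rides most of the calcified stack listed above.
import json                    # JSON: the de-facto wire format
import urllib.request          # HTTP over a client-server architecture

# Hypothetical endpoint, for illustration only.
req = urllib.request.Request(
    "https://api.example.com/v1/items",  # HTTP -> TLS -> TCP -> IPv4
    headers={"Accept": "application/json"},
)
with urllib.request.urlopen(req) as resp:  # traversing NATed networks
    items = json.load(resp)                # and back to JSON on the way out
print(items)
```

A dozen lines of standard library, and every single layer underneath, transport, addressing, framing, serialization, is one of the calcified defaults above.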
When a technology becomes widespread, it naturally gathers enough momentum to attract more people to work with it. The community memetic effect (or network effect) of the internet is its main driving force. For example, once enough people use a web browser daily, it is only natural that the web becomes the main software distribution platform, and expected that a simple document browser turns into a high-performance graphics renderer.
The '80s were a booming period for operating systems, with OSs specialized for microcomputers, mainframes, workstations, etc. Today the market has mostly consolidated into 4 main OSs (Windows, Linux, macOS, BSD), and those can be grouped into only 2 categories: POSIX & Windows. And now, Windows even ships Linux (WSL)… Whole billion-dollar sub-industries live by specializing in securing, micro-optimizing, and spreading best practices for those 4 OSs.
There are other reasons beyond just the community effect. Once a technology is widespread enough, more eyes are present to (1) improve said technology, (2) do security assessments, and (3) experiment with and spread best practices. While the path is chaotic, it generally results in technologies that are somewhat efficient, very versatile, and mostly secure; jack of all trades, champion of none.
A calcified technology is also very difficult to replace: too many people to convince, too many devices to replace (or keep backward compatible), too many best practices to re-learn. The QUIC protocol is a good example of a technology that, while mostly superior to TCP, has difficulties replacing it because of how optimized TCP+TLS implementations are: custom CPU instructions, decades of research poured into testing every limit of TCP, not to mention millions of routers and towering HTTP server stacks. Tellingly, QUIC itself had to be layered on top of UDP, because a brand-new transport protocol would never make it through the internet's calcified middleboxes.
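A small illustration of how entrenched the incumbent stack is: the full TCP+TLS+HTTP/1.1 path ships in Python's standard library, while speaking QUIC still requires a third-party userspace implementation. A minimal sketch, with example.com standing in for any HTTPS host:

```python
# The calcified path: TCP + TLS + HTTP/1.1, entirely from the standard library.
import socket
import ssl

ctx = ssl.create_default_context()
# TCP connection, backed by decades of kernel and NIC optimizations.
with socket.create_connection(("example.com", 443)) as tcp:
    # TLS handshake; ciphers are typically hardware-accelerated (e.g. AES-NI).
    with ctx.wrap_socket(tcp, server_hostname="example.com") as tls:
        tls.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
        print(tls.recv(200))

# QUIC, by contrast, has no standard-library equivalent: it lives in userspace
# libraries (e.g. aioquic) and rides on top of UDP, precisely because the
# layers below it are too calcified to change.
```

The asymmetry is the point: the incumbent is one import away on every machine, while the challenger has to bring its own implementation everywhere it goes.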
While I wish the world would move on from some calcified technologies, that there were more OS diversity, more network protocol diversity, and less cargo-culting in serialization patterns, I do value that an imperfect building block, set in calcium, lets the larger community innovate on top of it.
And no, AI will not solve this problem.