Progress is an Illusion
We keep being told things are getting better. Compute is cheaper, models are smarter, everything is a service now. The graphs go up and to the right. But the story doesn’t hold up once you stop letting Silicon Valley narrate it for you.
Pick your metric, pick your conclusion
The whole concept of progress depends on what you measure, and the choice of metric does all the heavy lifting.
GDP per capita? Sure, up. Biodiversity? Collapsed. Access to information? Technically unprecedented, but most of it is SEO spam and AI-generated slop. The ability to actually own and control your devices? Significantly worse than it was twenty years ago. You used to be able to install whatever you wanted on a computer you bought. Now your phone ships with an app store that takes a 30% cut and an OS that phones home every few minutes, and if you want to change any of that you’re looking at a weekend project.
You can always find a number going up somewhere. That’s not progress. That’s just high-dimensional data and a convenient choice of projection.
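Here’s that projection trick in miniature. The numbers below are invented purely for illustration: the same record supports opposite headlines depending on which column you plot.

```python
# One multivariate record, two projections, two opposite stories.
# All values are made up for illustration; only the shape of the
# argument matters.
timeline = [
    # (year, gdp_per_capita_index, wild_vertebrate_index)
    (1970, 100, 100),
    (1995, 180, 55),
    (2020, 260, 31),
]

gdp = [row[1] for row in timeline]
wildlife = [row[2] for row in timeline]

print("GDP trending up:", gdp == sorted(gdp))              # True
print("Wildlife trending up:", wildlife == sorted(wildlife))  # False
```

Same data, same years; the “conclusion” is entirely a function of the column you chose before you started.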
Wirth’s Law
There’s an idea called Wirth’s Law that says software gets slower faster than hardware gets faster. Niklaus Wirth wrote about it in 1995 and it’s only gotten more true since. It sits alongside May’s Law, which says software efficiency halves every 18 months, perfectly canceling Moore’s Law. And then there’s Andy and Bill’s Law from the 90s PC era: “What Andy giveth, Bill taketh away.” Andy Grove makes faster chips, Bill Gates ships fatter Windows to eat all the gains.
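The arithmetic of that cancellation is worth spelling out. A back-of-the-envelope sketch, taking both laws at face value (the 18-month cadence and the perfect cancellation are the laws’ own idealizations, not measured data):

```python
# Moore's Law: hardware speed doubles every 18 months.
# May's Law: software efficiency halves on the same schedule,
# i.e. the software's cost per task doubles.

def net_speedup(years: float, doubling_months: float = 18.0) -> float:
    """User-visible speedup after both trends run for `years`."""
    periods = years * 12 / doubling_months
    hardware_gain = 2.0 ** periods    # Moore giveth
    software_bloat = 2.0 ** periods   # May (and Bill) taketh away
    return hardware_gain / software_bloat

print(net_speedup(30))  # 1.0 — three decades of exponential gains, net zero
```

After thirty years the hardware is roughly a million times faster, and the user-visible result of the two exponentials is exactly a wash.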
This has been documented for thirty years and nothing has changed. Slack uses multiple gigabytes of RAM to send messages. VS Code is a text editor running on top of Chromium. Discord ships an entire browser engine to do what mIRC did in 2MB. We routinely ship 200MB apps to display text and images, tasks that a 2005 Nokia handled in kilobytes.
The hardware gets exponentially better. The software expands to consume all of it and then some. Net user experience improvement over a decade? Marginal at best, often negative. My ThinkPad with NixOS and Sway boots faster and uses less memory than most brand new laptops running Windows 11, because the bottleneck was never the hardware haha.
Jevons’ Paradox in Computing
In 1865, William Stanley Jevons noticed that when the steam engine made coal use more efficient, total coal consumption went up, not down. Cheaper per unit means more total usage. As he put it: “It is wholly a confusion of ideas to suppose that the economical use of fuel is equivalent to a diminished consumption. The very contrary is the truth.”
This maps onto computing almost perfectly. Make compute cheaper per FLOP and total consumption explodes. AWS Lambda and serverless were supposed to reduce resource usage through more efficient abstractions. Instead they unlocked entirely new categories of workloads and total cloud spend crossed $720 billion. Make storage cheaper per gigabyte and we don’t store the same amount more efficiently. We just store everything forever and build data lakes nobody queries.
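The mechanism fits in one function. A minimal sketch with a constant-elasticity demand curve; the elasticity value and the constant `k` are invented assumptions, and only the direction of the effect matters:

```python
# Jevons in one function: demand for useful work follows
# work = k * price**(-e). When efficiency doubles, each unit of work
# needs half the resource; if e > 1, total resource use still rises.
# k and e are illustrative assumptions, not fitted to any real data.

def resource_used(resource_per_unit: float,
                  elasticity: float = 1.5,
                  k: float = 100.0) -> float:
    units_of_work = k * resource_per_unit ** (-elasticity)
    return units_of_work * resource_per_unit

before = resource_used(resource_per_unit=1.0)  # baseline: 100.0
after = resource_used(resource_per_unit=0.5)   # efficiency doubled: ~141.4
print(before, after)  # total consumption went UP, not down
```

Whenever demand is elastic enough (e > 1), every per-unit efficiency gain is more than eaten by induced demand, which is the cloud-spend story in one inequality.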
Data centers now account for a growing share of global energy use despite decades of per-unit efficiency gains. The AI training boom is accelerating this hard. Training a single large language model can consume as much energy as a small town uses in a year, and inference at scale is on track to dwarf even that.
We made compute cheaper so we’d use less of it. We use astronomically more. Progress.
The Open Web Got Captured
I run NixOS. I use FOSS wherever I can. I think software freedom matters, not as ideology but as a practical requirement for controlling your own computing. And there’s a real argument that open source represents genuine progress. The Linux kernel is better than it was ten years ago. GCC and LLVM produce better code every release. Wayland is finally replacing X11 and it’s actually good.
But the broader trajectory is grim. We built the internet on open protocols. Email is federated by design. HTTP is an open standard. DNS is decentralized. And somehow almost nobody self-hosts email anymore because Gmail’s aggressive spam filtering disproportionately blocks small mail servers, making it economically irrational to even try.
XMPP existed before WhatsApp and was technically superior in nearly every dimension: federated, extensible, encrypted, open. It lost because WhatsApp shipped a smoother UX and leveraged network effects. RSS was the open, decentralized feed protocol. It lost to algorithmic timelines optimized for engagement, which is a polite word for addiction.
The protocols improved. The power dynamics reconsolidated. Every generation of web technology produces a brief window of decentralization followed by corporate capture. Web 1.0 was open, then Google and Facebook ate it. Web 2.0 was supposed to be participatory, then it became five platforms. Web3 was supposed to fix all of this and instead just produced new rent seekers with worse UX.
I went through the whole degoogling process recently and it’s absurd how much work it takes to use a phone without handing your entire life to a single company. You shouldn’t need to “sideload” (sorry) F-Droid, manually configure DNS, and replace every default app just to have basic privacy on a device you paid for. The fact that this is a nontrivial multi-day project tells you everything about what “progress” in mobile computing actually optimized for. It wasn’t the user.
AI Is Eating Its Own Tail
I work in ML and I’m not dismissing the engineering. The scaling laws are real. Emergent capabilities from next-token prediction are surprising and worth studying. There’s genuinely interesting work happening.
But the second-order effects are already visible. We built text generation at zero marginal cost and immediately flooded the internet with generated spam, fake articles, and astroturfed reviews. The quality of Google search results has visibly degraded because a significant chunk of the indexed web is now generated content optimized for ranking, not for being useful. The training data for future models is being actively poisoned by the output of current ones. This is called model collapse in the literature and nobody has a clean solution for it.
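The feedback loop is easy to demonstrate in miniature. A toy sketch, assuming each “model” is just a Gaussian refit to samples drawn from the previous generation’s fitted Gaussian (sample size and generation count are arbitrary choices that make the drift visible):

```python
import random
import statistics

# Toy model collapse: every generation is trained solely on the previous
# generation's outputs. Finite samples underrepresent the tails, and each
# refit bakes that loss in. Numbers are illustrative, not from the literature.
random.seed(0)

mean, std = 0.0, 1.0  # generation 0: the "human" distribution
for generation in range(1000):
    samples = [random.gauss(mean, std) for _ in range(50)]
    mean = statistics.fmean(samples)
    std = statistics.pstdev(samples)  # finite-sample refit loses variance

print(f"std after 1000 generations: {std:.2e}")  # orders of magnitude below 1.0
```

Real collapse involves far richer distributions than a Gaussian, but the mechanism is the same: sampling loses the tails, refitting forgets they existed, and diversity ratchets downward with no step that ever puts it back.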
We automated code generation with Copilot and Cursor and the vibe-coding movement took off. A whole cohort of developers is now entering the workforce who can produce code but can’t debug it, because they never built the mental model of what the code is actually doing. They can prompt but they can’t reason about execution. Make coding more accessible and total code production explodes, but the average quality per line drops because the barrier to producing garbage fell faster than the barrier to producing anything good.
Meanwhile the infrastructure cost is staggering. Training runs require thousands of GPUs for months. Inference at scale requires data centers drawing city-level power. The efficiency of individual operations improves constantly, and total energy consumption grows even faster. Jevons again. Always Jevons.
Maintenance vs. Progress
The longer you work in software the more you realize that most of the valuable work isn’t building new things. It’s keeping existing things running. Patching, monitoring, upgrading, migrating, documenting. The stuff nobody writes blog posts about (except this one, I guess, lol).
Progress narratives ignore maintenance because it isn’t exciting. But maintenance is what actually determines whether a system serves its users over time. The entire internet runs on critical projects maintained by a handful of people who haven’t been paid in years. The xz backdoor in 2024 happened because a piece of compression infrastructure used across virtually every Linux distribution was maintained by one burned-out person who got socially engineered. That’s not a progress story. That’s a structural vulnerability we keep ignoring because maintenance isn’t VC fundable and doesn’t make for good keynotes xD.
Log4Shell was the same pattern. A logging library that half the Java ecosystem depended on had a critical remote code execution vulnerability because nobody was paying attention to it. These aren’t isolated incidents. They’re the predictable result of an industry that celebrates creation and ignores upkeep.
What actually works
I don’t think nihilism is useful. I’m still going to write code, build things, and care about doing it well. But I’ve stopped thinking of any of it as “progress” in some grand civilizational sense.
The better framing is maintenance and local improvement. This system has a problem, I can fix it, so I will. These users need this capability, I can build it, so I will. Not because it contributes to some trajectory but because it’s the right thing to do with the skills and time I have right now.
The tech industry’s obsession with progress is really an obsession with narrative. It’s the story you tell investors and the story you tell yourself so that building another SaaS product feels meaningful. You don’t need a grand narrative. You just need to build things that work, for people who need them, and maintain them honestly.
The treadmill doesn’t have a destination. That’s fine. You can still do good work on it.