The Token Economics of Software Development
“Tokens” are everywhere these days. When people say “tokens”, they’re usually talking about one of two things, and proponents of each might call the other a scam. For our purposes here, we’re not talking about crypto. It’s hard to piece everything together, so I’m going to take a stab at it for software development - after all, “software has eaten the world.” But please take this analysis with a huge grain of salt.
Valuing Human-Generated Code
The key asset in software development is source code. Engineers write it, build systems with it, and run those systems on computers to drive the physical world. To measure this value, we can use an engineer’s salary as a proxy. It’s an imperfect measure, to be sure, but it’s statistically informative. Not all code is created equal: the essential code, in my view, is what implements core functionality and requires a developer’s full attention, while tests and generated code matter less. While there are many ways to measure source code, one of the simplest is lines of code (LOC). Again, it’s a rough metric, but it’s still useful. For reference, the Linux kernel has about 35M lines of code, and about 10M excluding drivers (which are usually considered less “essential”).
Now for the confusing part: tokens. What is a token in AI? Roughly, it’s a small unit of text a model reads and writes. In natural language, a token is typically a word, a piece of a word, or, in some languages, a common phrase.
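To make “token” concrete, here’s a deliberately naive sketch that splits a line of code into word-like and punctuation pieces. Real models use learned subword tokenizers (BPE and friends), so the counts differ, but the order of magnitude is similar:

```python
import re

def naive_tokenize(text: str) -> list[str]:
    # Split into runs of word characters and single punctuation marks.
    # This is a toy stand-in for a real subword tokenizer, not any model's actual one.
    return re.findall(r"\w+|[^\w\s]", text)

line = "return fib(n - 1) + fib(n - 2)"
tokens = naive_tokenize(line)
print(tokens)
print(len(tokens))  # 14 pieces - on the order of 10 tokens for one line of code
```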
Let’s connect these ideas with some anecdotal numbers. Salaries vary wildly, but in the US, let’s start with $100,000 per year. A productive engineer might produce around 10,000 lines of essential code annually (this is final output, not counting refactoring or bug fixes). A line of code averages about 10 tokens. This brings us to a simple, back-of-the-napkin equation for human-driven development: $100,000 = 100,000 tokens, or about $1 per token of essential code.
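The napkin math above, spelled out. Every constant here is an assumption stated in the text, not a measurement:

```python
# Back-of-the-napkin figures for human-driven development; all assumed.
ANNUAL_SALARY_USD = 100_000      # assumed US engineer salary
ESSENTIAL_LOC_PER_YEAR = 10_000  # assumed final essential output per year
TOKENS_PER_LINE = 10             # assumed average tokens per line of code

tokens_per_year = ESSENTIAL_LOC_PER_YEAR * TOKENS_PER_LINE
cost_per_token = ANNUAL_SALARY_USD / tokens_per_year
print(tokens_per_year)  # 100000 essential tokens per year
print(cost_per_token)   # 1.0 - roughly $1 per human-written token
```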
Valuing AI-Generated Code
Now, let’s bring in AI coding agents. To generate that same 10,000 lines of essential code (100,000 tokens), an LLM’s raw output might be 10x that amount, since not all of it will be perfect code. The input tokens required to prompt the model could be another 10x on top of that. So, we’re talking about roughly 1 million output tokens and 10 million input tokens. Using a hypothetical pricing model - say, $10 per 1 million output tokens, with input tokens costing a tenth of that - the entire task would cost about $20. This figure could go much higher depending on the complexity, of course. These numbers are highly variable and imprecise, but the direction they point in is sound. The cost difference is vast. On this scale, human labor is several thousand times more expensive for a well-defined task. Even as a rough estimate, the potential impact on software development is enormous. And while an LLM’s performance may drop as a codebase gets larger, this isn’t a new problem. As described in The Mythical Man-Month, human collaboration suffers from similar scaling issues, so this is a universal challenge rather than a new one.
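The AI side of the napkin, using the same assumed multipliers and the hypothetical pricing from the text:

```python
# AI-agent side of the napkin; multipliers and pricing are hypothetical.
ESSENTIAL_TOKENS = 100_000     # the 10,000 essential lines from before
OUTPUT_MULTIPLIER = 10         # raw LLM output ~10x the essential code
INPUT_MULTIPLIER = 10          # prompt/input tokens ~10x the output
PRICE_PER_M_OUTPUT = 10.0      # $ per 1M output tokens (hypothetical)
PRICE_PER_M_INPUT = 1.0        # a tenth of the output price

output_tokens = ESSENTIAL_TOKENS * OUTPUT_MULTIPLIER   # 1M output tokens
input_tokens = output_tokens * INPUT_MULTIPLIER        # 10M input tokens
cost_usd = (output_tokens / 1e6) * PRICE_PER_M_OUTPUT \
         + (input_tokens / 1e6) * PRICE_PER_M_INPUT
print(cost_usd)             # 20.0 - about $20 for the whole task
print(100_000 / cost_usd)   # 5000.0 - human labor is ~5,000x pricier here
```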
The Impact on Productivity and the Future
But the equation isn’t just about cost; it’s also about speed. With an LLM’s capacity, the time to build and deliver products can be drastically compressed, opening up a frontier of new opportunities. Just look at how quickly new models are being released and updated, or Cursor’s daily updates. To offer a personal example, I recently produced about 2,000 lines of code in a single day. Since 10,000 lines a year averages out to about 50 lines per workday, that’s a 40x productivity boost. When time compresses that dramatically, it changes your perception of what’s possible. The writing is on the wall, especially with recent releases like Codex CLI and Sonnet 4.5. This doesn’t mean the software engineer is obsolete overnight - these agents still need a human driver. But it does mean the entire software development lifecycle, and the industry with it, is about to be fundamentally changed. I once heard that the fastest human can run about as fast as a raccoon. It feels like we are all raccoons at the keyboard now.